Fake news didn't play a big role in NZ's 2023 election – but there was a rise in 'small lies'

8:08 pm on 15 November 2023

By Mona Krewel of The Conversation



The threat of disinformation on social media in the lead-up to New Zealand's 2023 election loomed large for the Electoral Commission and academics studying fake news.

So how bad did it really get?

As part of the New Zealand Social Media Study, we analysed more than 4000 posts on Facebook from political parties and their leaders. Our study focused on the five weeks ahead of election day.

What we found should give New Zealanders some comfort about the political discourse on social media. While not perfect, there was not as much misinformation (misleading information created without the intent to manipulate people) and disinformation (deliberate attempts to manipulate with false information) as everyone feared.

Looking for fake news and half-truths

To identify examples of both of those, a team of research assistants analysed and fact-checked posts, classifying them as either "not including fake news" or "including fake news".

Fake news posts were defined as completely or mostly made up, and intentionally and verifiably false.

An example of this type of disinformation would be the "litter box hoax", alleging schools provided litter boxes for students who identified as cats or furries.

Originating from overseas sources, this story has been debunked multiple times. In New Zealand, this hoax was spread by Sue Grey, leader of the NZ Outdoors & Freedoms Party.

In cases of doubt, or when the research assistants couldn't prove the information was false, they coded the posts as "not including fake news". The term "fake news" was therefore reserved for very clear cases of false information.

If a post did not include fake news, the team checked for potential half-truths. Half-truths were defined as posts that were not entirely made up, but contained some incorrect information.

The National Party, for example, put up a post suggesting the Ministry of Pacific Peoples had hosted breakfasts to promote Labour MPs, at a cost of more than $50,000. While the ministry did host breakfasts to explain the most recent budget, and the cost was accurate, there was no indication the purpose of these events was to promote Labour MPs.
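The coding rules described above amount to a simple decision procedure. The sketch below is purely illustrative and is not the study's actual tooling; the function name and inputs are hypothetical, and in the real study the judgements were made by human research assistants rather than software.

```python
# Illustrative sketch only: it mirrors the coding rules described in the
# article, with the factual judgements assumed to come from human coders.

def code_post(verifiably_false: bool, in_doubt: bool,
              contains_some_incorrect_info: bool) -> str:
    """Return a coding label for a Facebook post, following the study's rules."""
    # Posts that are completely or mostly made up, and intentionally and
    # verifiably false, are coded as fake news. Doubtful or unprovable
    # cases are not, so "fake news" is reserved for very clear cases.
    if verifiably_false and not in_doubt:
        return "including fake news"
    # Posts that are not fake news but contain some incorrect
    # information are coded as half-truths.
    if contains_some_incorrect_info:
        return "half-truth"
    return "not including fake news"
```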

How 2023's election compared to 2020

At the beginning of the campaign, the proportion of what we identified as fake news being published on Facebook by political parties and their leaders was 2.5 percent - similar to what we saw in 2020.

The proportion of fake news posts then dropped below 2 percent for a long period and even fell as low as 0.7 percent at one point in the campaign, before rising again in the final stretch. The share of fake news peaked at 3.8 percent at the start of the last week of the campaign.

Over the five weeks of the campaign, an average of 2.6 percent of Facebook posts by political parties and their leaders in any given week qualified as fake news. In 2020, the weekly average was 2.5 percent, which means the increase in fake news was minimal.

Much of the outright fake news came from parties on the fringes. According to our research, none of the major political parties were posting outright lies.

But there were posts from all political parties assessed as half-truths.

Half-truths stayed well below 10 percent during the five weeks we looked at, peaking at 6.5 percent in the final week. On average, the weekly share of half-truths was 4.8 percent in 2023, while in 2020 it was 2.5 percent.

So while the number of "big lies" - also known as "fake news" - did not increase in 2023 compared to 2020, the number of "small lies" in political campaigns is growing.

All of the political parties took more liberties with the truth in 2023 than they did in 2020.

Playing on emotions and oversimplifying

More than a third of all misleading posts in 2023 were emotional (37 percent), targeting voters' emotions through words or pictures. Some 26 percent of the social media posts jumped to conclusions, while 23 percent oversimplified the topics being discussed. And 21 percent of the posts cherry-picked information, meaning the information presented was incomplete.

Some of the social media posts we identified as fake news or half-truths used pseudo-experts: people with some academic background, but who are not qualified to be expert witnesses on the topic under discussion (18 percent).

We also saw anecdotes of unclear origin, instead of scientific facts (15 percent), while 7 percent had unrealistic expectations of science, such as expecting science to offer 100 percent certainty.

Some posts claimed their authors had a silent majority behind them (5 percent). Another 5 percent of the posts identified as disinformation resorted to personal attacks rather than engaging with someone's arguments.

Staying vigilant

The levels of misinformation and disinformation on social media during the past two elections in New Zealand have been fairly low - and certainly no cause for panic. But that doesn't mean it will always stay that way.

We need to keep an eye on the social media campaigns in future elections and, in particular, monitor the development and use of misinformation and disinformation by parties on the fringe.

* Mona Krewel is a senior lecturer in Comparative Politics at Te Herenga Waka - Victoria University of Wellington
