
Commentary: Meta’s U-turn on fact-checking is dangerous, but perhaps there’s a silver lining


Meta’s fact-checking rollback – combined with Donald Trump’s propensity for false claims and rising disinformation – will likely contribute to a surge of harmful content online, says former journalist and NMP Nicholas Fang.

File photo of Meta founder Mark Zuckerberg at a joint hearing of the Commerce and Judiciary Committees on Capitol Hill in Washington in April 2018. (AP Photo/Alex Brandon)


13 Jan 2025 06:00AM

SINGAPORE: The recent announcement by Meta founder Mark Zuckerberg that the company will get rid of fact-checking and scale back content moderation on its social media platforms including Facebook, Instagram and Threads is not entirely surprising.

After all, his move to introduce such controls was largely a response to public and stakeholder pressure in the wake of Donald Trump’s first successful bid for the presidency of the United States in 2016.

Since then, Mr Zuckerberg has made comments that suggest he was never fully convinced of the company’s duty to moderate harmful content or to fact-check information shared by its users. The company has also been downsizing its content moderation and policy teams since 2022 as part of cost-cutting measures, and continues to do so on a rolling basis.

The latest announcement, however, is the clearest indication that Meta is bowing to political pressure ahead of the second Trump presidency, and comes on the heels of other moves that include promoting GOP ally Joel Kaplan to head of policy. Mr Zuckerberg had earlier appointed staunch Trump supporter Dana White, who runs the UFC mixed martial arts promotion, to Meta’s board.

Mr Zuckerberg also said he would work on issues of free speech with Trump, which is ironic given that, just four years ago, the once and future president was considered too dangerous to even be a Meta user.

DANGEROUS U-TURN

The changes will be rolled out first on Meta platforms in the US. But beyond domestic US politics, these developments should be a concern for anyone who uses their platforms – both in and out of the US.

Top on the list of concerns are the heightened risks of misinformation and disinformation, and their impact on truth, society and democracy.

Mr Zuckerberg’s announcement has been roundly met with criticism by disinformation experts, academics and fact-checking organisations, many of whom point to the fact that Meta’s efforts over the years have indeed helped to tamp down the spread of disinformation and harmful content.

While Meta’s platforms are just one part of the entire global social media ecosystem, they account for a massive 3 billion users around the world.

Most observers believe that, when juxtaposed against Trump’s propensity for amplifying false claims and fake news and the rising popularity of disinformation as a tool for foreign interference and influence operations, Meta’s move will likely contribute to a surge of harmful content online.

This is a distinct possibility in light of what has happened on X, formerly known as Twitter, which similarly eschewed fact-checking for its Community Notes solution. This places the onus on users to add context and corrections to other people’s posts.

Mr Zuckerberg has praised this approach in the past, even though it ended up with a mishmash of fact-checking, trolling and other community-driven behaviour on X, much of which has not been positive or healthy.

A ground-up, community-led approach is also less likely to be efficient and effective than the work of professional fact-checkers dedicated to debunking a broad range of potential fake news.

By getting rid of professional fact-checkers, Meta will also directly affect the operations of fact-checking organisations worldwide. The company had partnered with more than 90 such organisations globally to fact-check content on its platforms.

WHAT CAN BE DONE

This raises the question of what can be done to address this potentially dangerous situation.

The likelihood of Meta being pressured into reversing its decision seems low, especially given the incoming Trump administration’s preference for a more unfettered social media landscape.

But perhaps this is not such a bad thing.

For full disclosure, I should mention that I have been running a leading independent fact-checking platform focused on Singapore since 2019.

In the course of our work, I’ve come to realise that it’s simply impossible to fact-check every false claim circulating in the public sphere.

Purveyors of fake news range from mischievous individuals to state and non-state actors seeking to interfere with and influence target audiences. Enabled by ubiquitous social media platforms and technology such as artificial intelligence, they can generate a tsunami of disinformation at a volume that easily overwhelms any attempt to counter it.

As such, even the most earnest and zealous fact-checkers employing a whack-a-mole strategy against such misinformation and disinformation will likely not move the needle significantly on a global scale.

Our focus instead has been on leveraging our fact-checks and other outreach efforts to educate the broader public on the threat of fake news, and to encourage a culture of emotional scepticism when it comes to suspect content that is encountered online and elsewhere. This form of inoculation seems the best approach in the longer term to ensure a safe and stable information environment.

A WHOLE-OF-NATION APPROACH

When one considers the future threats that loom on the horizon, it is clear that a holistic, whole-of-nation approach to developing solutions is needed.

As observed in other countries, fact-checking efforts too can be weaponised or politicised, with different sides of political divides deploying fact-checkers to undermine one another.

Beyond the increasing divisiveness this can encourage in societies, another effect can be to confuse the general public to the point that trust in any source of information – including credible platforms such as established media outlets – diminishes irreparably.

As such, what is needed for a long-term, sustainable and robust counter to misinformation and disinformation is for all stakeholders to come together to work for a common goal.

In Singapore, the government has enacted laws aimed at preventing online falsehoods from being spread. At the same time, educational efforts have been deployed by academic institutions and organisations such as the National Library Board to build greater news, media and information literacy.

This needs to be paired with the work by tech companies, independent fact-checkers and media outlets to ensure that sources of disinformation are restricted and disempowered, while also encouraging the development of high-quality, trusted and credible sources of factual information.

If there is any small silver lining in Meta’s decision to pull back from fact-checking and content moderation, it is that it serves as a reminder of the reality of the threats we face as a society from fake news and disinformation, and of how an educated, informed and activated citizenry might be our best bet for overcoming current and future threats.

Nicholas Fang is a former journalist and Nominated Member of Parliament. He is also the founder and managing director of Black Dot Research, a market and social research agency that operates Singapore’s leading independent fact-checking platform.
