Meta watchdog says grassroots fact checks risk harm to users
Facebook parent Meta announced last year that it would end its use of external fact-checkers in the US.
That scheme had employed third parties including AFP to expose misinformation.
Instead, Meta said it would ask ordinary users to verify controversial claims in a system known as "community notes", mirroring methods used on X and other social networks.
If rolled out worldwide, that scheme "could... pose significant human rights risks and contribute to tangible harms," Meta's Oversight Board said in a Thursday advisory.
That was especially true in "repressive human rights regimes, in particular electoral contexts and in ongoing crisis and conflict situations," it added.
AFP was one of 23 organisations whose public comments were accepted by the Oversight Board as it prepared its advisory.
The independent board is often referred to as Instagram and WhatsApp owner Meta's "supreme court", ruling on moderation decisions and advising on policy.
Created and voted on by ordinary social media users, community fact-checks generally rely on independent journalism to back up their claims.
This is difficult or impossible in repressive regimes, the board noted.
During conflicts, some groups may be cut off from access and unable to weigh in with their side of the story, it added.
The board recommended that community notes should not be introduced where there is active fighting or widespread obstacles to getting online.
A free media and civil society are also needed for ordinary people to fact-check claims in the midst of elections.
Without them, "the program risks publishing misleading notes", the board said.
And in some parts of the world, "malicious actors have repeatedly demonstrated the ability to coordinate large numbers of accounts to promote deceptive information" and could do so via Meta's community notes, it added.
"This risk will become more acute as artificial intelligence facilitates the scaled creation and operation of these networks," the board warned, suggesting that Meta rule out countries with histories of disinformation campaigns.
Other factors to take into account included language barriers and political polarisation.
The board urged Meta to test for "risks related to contributor anonymity, coordinated disinformation campaigns and gaming of the system, language representation and contributor participation" before launching community notes in a country.
It should also grant outside researchers access to data on the scheme.