- Meta has told the Brazilian Attorney General that Community Notes will only be expanded beyond the US once it has been properly tested there.
- Despite this, the Attorney General says the proposed changes don’t comply with the country’s laws.
- The Attorney General’s office, justice ministry and human rights ministry, as well as the presidential communication service, will hold a public hearing to discuss Meta’s new policy.
Last week, Meta CEO Mark Zuckerberg took to his social media platforms to announce that fact-checking would be phased out on Facebook, Instagram, and Threads. Almost as soon as the words passed his lips, experts began debating just how bad an idea this was.
Unfortunately for Zuckerberg, this move also prompted lawmakers beyond US borders to get riled up and demand answers. Brazil, for one, gave Meta 72 hours to explain how the fact-checking policy would apply there.
As a reminder, Meta will ditch third-party fact-checking for a Community Notes-style solution. Here, users will be tasked with correcting false statements on Meta platforms, similar to what X does. As anybody who has used X recently will know, Community Notes can be great but often moves too slowly to correct false statements, if a Community Note appears at all.
With that in mind, Meta has been lambasted for the decision to kick fact-checkers to the curb, but it seems that Zuckerberg’s original statement related specifically to the US. In its response to the Brazilian government, the tech firm wrote: “Meta would like to make it clear that it is, at this time, only ending its independent fact-checking program in the United States, where we will test and refine Community Notes before beginning any expansion to other countries”.
However, Brazil’s Attorney General said the proposed changes neither comply with the country’s laws nor protect the rights of its citizens. As such, the Attorney General’s office, justice ministry and human rights ministry, as well as the presidential communication service, will hold a public hearing to discuss Meta’s new policy. According to Barron’s, that hearing was scheduled for Thursday, but more time is needed to confirm participants.
It’s worth remembering that Brazil isn’t shy about banning social networks that don’t comply with its laws.
The problem with asking users to police content
As we mentioned, Community Notes can be great if they are timeous and accurate, but unfortunately, that isn’t always the case.
This is largely due to how this fact-checking function works. In order for a post to get a Community Note on X, contributors need to vote on whether a proposed note is helpful or not. For a post claiming tomato sauce is just red mustard, for instance, this is easy, but it gets more complex when the subject of a proposed note is divisive.
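To make the gatekeeping concrete, here is a toy sketch of that voting step. The function name, the two-viewpoint model, and the thresholds are all our own illustrative assumptions, not X's actual ranking algorithm, but they capture the publicly described idea that a note only surfaces once raters from differing perspectives agree it is helpful:

```python
# Toy sketch of a Community Notes-style publication check.
# Assumption: raters fall into two viewpoints ("A" and "B") and a note
# is shown only when BOTH sides rate it helpful. X's real system is far
# more sophisticated; this only illustrates why divisive notes stall.

def note_is_published(ratings, min_ratings=5, threshold=0.8):
    """ratings: list of (viewpoint, helpful) tuples, viewpoint in {"A", "B"}."""
    if len(ratings) < min_ratings:
        return False  # not enough votes yet, so the note stays hidden
    for side in ("A", "B"):
        side_votes = [helpful for viewpoint, helpful in ratings if viewpoint == side]
        if not side_votes:
            return False  # no cross-viewpoint agreement is possible
        if sum(side_votes) / len(side_votes) < threshold:
            return False  # one side rejects the note
    return True

# Easy case (red-mustard-style claim): both sides agree the note helps.
easy = [("A", True), ("A", True), ("B", True), ("B", True), ("B", True)]
# Divisive case: one side votes the note down, so it never appears.
divisive = [("A", True), ("A", True), ("A", True), ("B", False), ("B", False)]

print(note_is_published(easy))      # True
print(note_is_published(divisive))  # False
```

Even in this simplified form, the divisive post never gets a note, and the easy one only gets a note after enough votes have accumulated, which is exactly the delay described below.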
This process of writing a note, citing sources and then publishing it for approval takes time, and in that time, misinformation can spread far and wide. Even once a Community Note has been published, it is unlikely to have the same reach as the original post did without a note.
“Even when Community Notes are helpful, it’s estimated that the misleading content they are attached to is “often viewed 5 to 10 times more” than the note itself,” the Centre for Countering Digital Hate wrote about X’s Community Notes in 2024.
“Our own research has highlighted similar weaknesses [in Community Notes]. When we found that just 50 of Elon Musk’s own posts that fact-checkers say promote false or misleading claims about elections had amassed 1.2 billion views, we also found that none had Community Notes. Similarly, our analysis of 1,060 posts from accounts that were influential in promoting false or misleading claims that contributed to riots targeting migrants and Muslims in the UK found that just one displayed a Community Note,” the CCDH added.
This is worrying, especially when you consider how many more people are on Meta platforms compared to X.
Facebook, WhatsApp and Instagram are massive social platforms, with 3.29 billion people using at least one Meta app every day. Unless the process of publishing notes is robotically efficient, which we doubt given that it will be run by humans, misinformation could reach vast audiences before a note ever appears.
One of the major concerns we have is that by ditching fact-checkers, Meta is setting itself up for a major crisis.
Between 2016 and 2017, as many as 43 000 Rohingya people were killed in a genocide in Myanmar. During that time, Facebook served as a platform for the military and others to spread hate speech about the Rohingya people, and Meta (then Facebook) later admitted it could have done more to prevent the genocide from happening.
While Zuckerberg argues that the content fact-checkers remove amounts to just one percent of the content produced every day, without those guardrails in place, we suspect the volume of false content will only grow once fact-checkers are ousted from Meta platforms.