Meta’s Shift to Community Notes Could Lead to Job Losses and Weaken Misinformation Control in Africa
Meta’s decision to replace its long-standing fact-checking model with community notes across its platforms—Instagram, Facebook, and Threads—has raised concerns among content moderation companies, fact-checkers, and civil rights activists in Africa.
The shift, announced on January 7, 2025, threatens the jobs of contractors at content moderation firms, particularly in Kenya, Nigeria, Egypt, and South Africa.
The move comes in response to claims that Meta’s fact-checking program, launched in 2016, had been used to “censor” content. Instead of relying on third-party fact-checkers, Meta will now rely on users to flag misleading content, empowering community members to add context to potentially false or harmful posts.
However, critics argue that the shift could have serious consequences for the fight against misinformation in Africa.
In countries like Kenya, where Meta’s platforms, including WhatsApp, Facebook, and Instagram, count millions of users, disinformation could spread unchecked without a robust fact-checking mechanism in place.
Emmanuel Chenze, COO of African Uncensored, warned that the loss of fact-checking capabilities could lead to a crisis in African democracies, especially with upcoming elections in Tanzania, Uganda, and Kenya.
“We’ve seen the mess caused by the lack of fact-checking initiatives during the 2017 election period,” Chenze said, referencing the “Real Raila” videos and the influence of Cambridge Analytica. “Those psyops had no countermeasures. Now, we risk returning to that situation without a proper framework to address misinformation.”
The decision to abandon fact-checking is likely to hurt fact-checking organisations like PesaCheck, which receives a significant portion of its funding from Meta.
Meta provided over half of PesaCheck’s total financial support in 2022 and 43% of its funding in 2023. Without that money, organisations like PesaCheck may struggle to maintain their operations, hindering their ability to fight misinformation and safeguard public discourse.
“I fear the implications of this shift, especially when it comes to job losses for content moderators and the reduced capacity of organisations to do their critical work,” Chenze added. “It’s a grim outlook for a region already battling the spread of misinformation.”
Meta’s new approach, modeled on the Community Notes system pioneered by X (formerly Twitter), places the power to flag misleading content in the hands of users. While that system has been praised in certain contexts, critics argue it may be less effective in regions where misinformation is often politically motivated and spreads rapidly.
For instance, the reliance on community notes could allow false narratives to go unchecked, especially when political or social interests manipulate the process.
The change also disrupts the financial relationships Meta has with third-party fact-checking organisations such as Africa Check and PesaCheck, which rely heavily on funding from the tech giant. Without this support, these organisations could struggle to counter harmful content and the spread of fake news.
While Meta’s shift to community notes may improve user engagement and moderation transparency, experts worry it will exacerbate the misinformation crisis in Africa.
The fact-checking ecosystem, which has been essential in countering the spread of disinformation, could be weakened by this move, leaving many African nations vulnerable to the dangers of unchecked falsehoods.
Meta’s decision to move to community-driven content moderation is also raising alarms about the future of job security for content moderators. Several outsourcing firms that previously worked with Meta, such as Sama and Majorel in Kenya, have already exited the content moderation business.
Sama, which specialised in flagging harmful content, disclosed that content moderation accounted for less than 4% of its business before it shifted to AI data labeling. The company had faced criticism for its treatment of workers and for failing to provide sufficient psychological support to moderators who reviewed violent or harmful content.
The wider implications of Meta’s shift could also affect its standing in international legal and regulatory environments.
The European Union’s stricter regulations, such as the Digital Services Act, require platforms like Meta to address illegal content or face significant fines.
As Meta moves forward with its plan to implement community notes, the European Commission has said it is closely monitoring the situation, particularly as the model is rolled out first in the U.S.