Facebook Learns From Cambridge Analytica Scandal, Intros Measures to Combat Misinformation Ahead of Kenya Polls

Facebook CEO Mark Zuckerberg. Courtesy, Vox

Meta-owned Facebook does not have a good track record when it comes to managing misinformation, hate speech and other social media vices. The platform battled the Cambridge Analytica saga after it was reportedly determined that the platform had allowed election tampering in some countries, including Kenya during its 2017 polls.

Keen not to battle the same concerns this time around, the platform has launched measures to ensure that it is used as intended during the August polls in Kenya. It says it has been working to ensure the polls are safe and secure when that time comes.

"Over the past year, a specialist team of local experts has been working closely with election authorities and trusted partners, and a dedicated Kenyan Elections Operation Centre has been activated as part of its ongoing work in supporting major elections around the world" – Meta

To this end, here are the investments and work that the social media giant has put in place for the said purpose, verbatim:

Removing Harmful Content to Keep Users Safe

In order to quickly identify and remove content that violates its Community Standards, Meta uses a combination of artificial intelligence, human review and user reports. It has quadrupled the size of its global team focused on safety and security to more than 40,000 people and hired more content reviewers, including in Swahili. In the six months leading up to April 30, 2022, Meta took action on more than 37,000 pieces of content for violating its Hate Speech policies on Facebook and Instagram in Kenya. During that same period, Meta also took action on more than 42,000 pieces of content that violated its Violence & Incitement policies.

Protecting Female Public Figures and Human Rights Defenders

Informed by local challenges around increased abuse against female public figures, Meta formed a working group for the protection of female public figures during the Kenya elections. Partnering with local civil society organizations such as Kenya Women Parliamentary Association (KEWOPA), Pollicy, and UN Women, Meta has trained women members of Parliament, aspirants and human rights defenders to utilise its safety tools and resources to ensure a safer experience across its platforms. 

Reducing Problematic Content Across Facebook, Instagram, WhatsApp, and Messenger

To reduce misinformation and lower the risk of problematic content in Kenya ahead of and during the elections, Meta is temporarily reducing the distribution of content across Facebook and Instagram from those who have repeatedly or severely violated its policies. In 2021, Meta also announced new rules for WhatsApp, including limiting the forwarding of highly forwarded messages to just one chat at a time. Since then, Meta has seen a 70% drop in the number of highly forwarded messages on WhatsApp. Meta has also introduced a forward limit on Messenger, so messages can only be forwarded to five people or groups at a time.

Combating Misinformation and False News

As part of its elections work, Meta removes the most serious kinds of misinformation from Facebook and Instagram, such as content that is intended to suppress voting or could contribute to imminent violence or physical harm. During the Kenyan elections, based on guidance from local partners, this will specifically include false claims that people with weapons are guarding polling stations, false claims that polling stations have been damaged, and photos and videos shared out of context depicting ballot-stuffing or violence. Additionally, for content that doesn't violate these particular policies, Meta runs a third-party fact-checking programme, partnering with independent fact-checkers in Kenya (AFP, PesaCheck and Africa Check) who review content in both English and Swahili. When a piece of content is reviewed and rated as false, Meta reduces its distribution and adds a warning label with additional information.

Supporting Digital Literacy in Kenya

Working with local partners to improve digital and media literacy in Kenya, Meta has launched programmes like My Digital World and partnered with iEARN Kenya to raise awareness among youth, teachers, parents and guardians on topics such as online safety, privacy, digital citizenship, and news and media literacy. Meta is also working with UNESCO through the EU-funded Social Media for Peace project in Kenya, a programme aimed at addressing concerns around the use of digital communication tools to spread harmful content. In the lead-up to and during the elections, Meta has also rolled out a radio campaign in multiple local languages, including Luo, Kalenjin, Kikuyu, Swahili and English, focused on educating listeners on how to spot hate speech and misinformation, and what actions to take.

Making Political Advertising More Transparent

Meta’s Ad Transparency tools help people understand who is behind the political ads they see on Facebook and Instagram. Advertisers who want to run political ads in Kenya must complete a verification process confirming their identity and that they live in the country, and additional checks are run to ensure compliance with Meta’s policies. All political ads must carry a “Paid for by” disclaimer showing who is behind them, and they are also included in the Ads Library, where everyone can see which ads are running, view information about targeting and find out how much money was spent.

Promoting Civic Engagement 

Helping to build informed and engaged communities is central to our work around elections. In Kenya, we’ll have an “I Voted” sticker on Instagram, and on 9th August, Election Day, we’ll remind people in the country that it’s time to vote with a notification at the top of their Facebook Feed.