Facebook Approved Pro-Genocide Ads in Kenya After Claiming to Foster 'Safe and Secure' Elections

Kenya's national cohesion watchdog has threatened to suspend the social network from the country in a week if it doesn't mitigate hate speech.

Photo: Meta CEO Mark Zuckerberg speaks at Georgetown University in Washington, Thursday, Oct. 17, 2019. (Nick Wass/AP)

Kenya’s national cohesion watchdog threatened on Friday to suspend Facebook from the country if it doesn’t mitigate hate speech ahead of the country’s general elections next month. The regulator has given the company one week to remediate the problem, which includes Facebook’s approval of ads advocating for ethnic cleansing. Human rights organizations and Facebook whistleblower Frances Haugen are calling on Facebook to immediately suspend all advertising in Kenya and take other emergency steps.

The National Cohesion and Integration Commission (NCIC), a Kenyan agency founded to mitigate ethnic violence and promote national healing in the wake of the 2007-08 post-election crisis, told reporters on Friday that Facebook was “in violation of the laws of our country.”


“They have allowed themselves to be a vector of hate speech and incitement, misinformation and disinformation,” Danvas Makori, an NCIC commissioner, said during a briefing.


Facebook claimed last week to have cracked down on harmful content in the country, issuing a press release praising itself for the many ways it was tackling problematic content. But immediately after, the company approved ads run in both English and Swahili crafted specifically to instigate ethnic violence in Kenya, human rights groups said.


Nonprofit groups Global Witness and Foxglove revealed Thursday that a third independent test had proven Facebook incapable of detecting language designed to incite violence around the August elections. Specifically, the groups said, Meta approved ads on Facebook in both Swahili and English that included calls to rape and behead Kenyan citizens along ethnic lines.

“If Mark Zuckerberg chooses to sit on his hands, it will make it clear that for him, and for Facebook, American lives matter — Kenyan lives don’t,” said Foxglove director Cori Crider.


The NCIC said Friday that the results of the Global Witness/Foxglove tests had corroborated its own internal findings.

“Suspending ads and rolling out ‘break glass’ measures are steps Facebook can take to reduce the risk to the Kenyan election today. That’s what we’re calling for,” Crider said, pointing to steps the platform took following the Jan. 6, 2021, insurrection at the U.S. Capitol.


Global Witness said in a statement that it deliberately chose not to publish the exact language used in its tests on Facebook, but described the ads as “dehumanising, comparing specific tribal groups to animals and calling for rape, slaughter, and beheading.”


The groups said the tests focused exclusively on Facebook’s ad system. This allowed them to evaluate Facebook’s content approval process while avoiding exposing any real users to the virulent content.

Global Witness said that after Facebook became aware of the tests, the company published a press release describing the myriad measures it has allegedly taken to “ensure a safe and secure general election in Kenya.”


Meta claimed in the statement that it uses a “combination of artificial intelligence, human review and user reports” to tackle harmful content and said it has partnered with civil society organizations, such as the Kenya Women Parliamentary Association, “to ensure a safer experience across our technologies.”

Global Witness said that following Meta’s statement, it submitted additional ads to test whether Facebook had actually rolled out improvements that would better detect hate speech in ads. Facebook failed those tests as well.


“Once again the ads we resubmitted in Swahili and English were approved,” Global Witness said.

In a statement to Gizmodo, a Meta spokesperson reiterated that the company has taken “extensive steps” to detect hate speech on the platform and that it is “intensifying these efforts” ahead of the Kenyan national elections. “Despite these efforts, we know that there will be examples of things we miss or we take down in error, as both machines and people make mistakes,” they said.


The spokesperson said Meta has teams that monitor for such errors, adding they are addressed “as quickly as possible.”

Facebook whistleblower Frances Haugen echoed Global Witness and Foxglove’s call for a suspension of all advertising in Kenya.


“The safety of the Kenyan election matters every bit as much as the United States,” Haugen said. “After January 6, Facebook staff took urgent steps to restrict the most dangerous features on the platform, reducing violence and hate on the platform. They have also previously suspended ads. Given we now have evidence Facebook did not invest in basic Ads safety measures in Kenya—Facebook should suspend ads until the Election at a minimum.”

Kenya’s general elections to elect a new president and members of its National Assembly will take place on August 9, 2022.


Update, 2:30pm: Added a comment from a Meta spokesperson provided after publication.
