Facebook Workers Warned About Hateful Groups Well Before Capitol Riot

Photo: Drew Angerer (Getty Images)

Last week, Facebook finally announced it would stop recommending so-called “civic” (read: political) Groups to users in the wake of the violent siege of the U.S. Capitol. But according to a new Wall Street Journal report, that decision came only well after data scientists within the company had raised the alarm that Groups had become a blight on the wider community.

The Journal’s report cites an internal August 2020 presentation from the company’s data scientists that found roughly 70% of the 100 most popular civic Groups in the U.S. had been deemed “non-recommendable” for various reasons. Some promoted misinformation and “calls to violence,” while others were sources of “bullying and harassment.” The presentation noted that some of the most popular private Groups were controlled by administrators who saw this sort of behavior as desirable.

Per the August report, some Group admins showed members ways to post violent material that Facebook’s hate-speech-sniffing algorithms weren’t designed to catch; in some cases, members were encouraged to sneak vile content under the radar as comments on more benign posts, rather than as standalone posts. Other admins simply threatened to ban anyone who reported any content within the Group at all.

This August presentation, while damning, was only the latest indication of internal concern at Facebook about the viability of Groups to leak outside Menlo Park. As the Journal put it: “Facebook executives were aware for years that tools fueling Groups’ rapid growth presented an obstacle to their effort to build healthy online communities.”

While Facebook COO Sheryl Sandberg continues to deny the platform’s role in stoking hate among the far right, the August presentation is just the latest example of employees sounding the alarm over that exact issue. Immediately following the Capitol riot, for example, documents circulated by data scientists at the company reported that user reports of “violent content” had spiked to ten times their level earlier that morning, according to another series of internal posts reviewed by the Journal.

But despite the concerns of these rank-and-file employees, the company’s more senior executives kept their intervention in these Groups to a minimum, instead maintaining Facebook’s track record of only being an arbiter of truth when it’s most convenient.

Per the Journal, Facebook’s public policy team “balked” at the idea of taking any sort of action against the more popular conservative Groups, even those that veered toward violence. When data scientists suggested restricting those Groups, higher-ranking Facebook execs pointed out that such plans would directly hamper the platform’s most valued metric: growth. This sort of stonewalling eventually led some staffers to send daily reports documenting the company’s failure to police its Groups to Guy Rosen, Facebook’s VP of “integrity”-related issues.

While these higher-ups dragged their feet, Groups were blatantly promoting conspiratorial garbage in the run-up to January 6th. One report from the Tech Transparency Project earlier this month found that in the aftermath of the 2020 election, multiple far-right Groups and pages ramped up calls to “Take America Back” and to “#OccupyCongress” that continued right up until the eventual riot.

Apparently, all it took was an insurrection at the U.S. Capitol to get the platform to actually act. Aside from making its originally temporary hold on promoting political Groups permanent, Facebook will be disabling some of the recruitment tools that researchers cited as enabling these Groups’ rapid growth, Rosen told the Journal.

We’ve reached out to Facebook for comment and will update this piece when we hear back.
