35 Internal Code Words Facebook Uses to Talk About Its Users and Tools

The ongoing 'Facebook Papers' document leaks include a lot of internal company lingo. We rounded up some key examples.

Photo: Kirill Kudryavtsev (Getty Images)

One of the most surreal parts of going through the mountain of documents captured from within Facebook’s walls by whistleblower Frances Haugen is seeing the words employees use when discussing some of the company’s most sensitive products and systems. Many of these names (CORGI! Yoda!) sound deceptively cute, while others sound more... sinister.

While the terms themselves are interesting simply because they are used internally, they also provide key insights into Facebook’s internal machinations and how the company thinks about the issues we’ve all come to know and loathe. For the most part, these definitions are pulled from an internal glossary used by the company’s (now disbanded) Civic Integrity team, which was part of Haugen’s disclosures made to Congress and the Securities and Exchange Commission. Gizmodo, along with dozens of other news organizations, obtained redacted versions of these documents.

There are other terms in here, too, that don’t appear in the glossary but do appear frequently in some of the other documents provided to us. Gizmodo was able to define these with the help of a former Facebook employee who spoke to us on the condition that they not be named.

With all of that out of the way, let’s get to those terms!

1. CORGI

This refers to a complex mathematical model that Facebook’s researchers came up with internally in order to find “clusters of users” that might be operating in an inauthentic way—like users that might be commenting a bit too frequently on each other’s posts. Based on this model (which, yes, is spelled like the dog breed), these researchers could identify likely bots and bad actors.

2. Bonjovi

Employees used this internal investigation tool to track—among other things—accounts on Instagram and Facebook that might be engaged in human trafficking. According to the internal documents we were provided, Bonjovi could be used to track a potential trafficker’s on-platform search activity, and a history of the profiles that said trafficker was viewing as a way to suss out who their potential victims might be.

3. H1/H2/H3

Because Facebook is such a massive operation with countless lines of code running at any given time, the company needs to push out any updates in a series of stages. The first stage is H1, which deploys code to a set of internal, Facebook-specific servers only accessible to the company’s engineers. If that deployment goes off without a hitch, the code gets pushed out to H2, where a “few thousand machines serve a small fraction of real-world users,” according to a Facebook research paper. If that code works as expected, it gets pushed to H3—full deployment across all of Facebook’s servers.
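
To make the idea concrete, here’s a rough sketch of what a staged H1-to-H3 rollout gate might look like in code. The stage names come from the documents; the functions, health checks, and everything else are our own illustration, not Facebook’s actual deployment tooling.

```python
# Hypothetical sketch of a staged (H1 -> H2 -> H3) rollout gate.
# Only the stage names come from the documents; the rest is illustrative.

STAGES = ["H1", "H2", "H3"]  # internal servers -> small slice of real users -> full fleet

def deploy(release, push_to_stage, healthy):
    """Push a release one stage at a time, stopping at the first failure.

    push_to_stage(release, stage) deploys the code to that tier;
    healthy(stage) reports whether error rates stayed within bounds.
    """
    for stage in STAGES:
        push_to_stage(release, stage)
        if not healthy(stage):
            print(f"{release}: problems detected at {stage}, halting rollout")
            return False
        print(f"{release}: {stage} looks good, promoting")
    return True
```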

4. DCI

“Destructive conflict,” or DCI, is a label generated within the company that’s meant to flag “uncivil” or abusive conversations between users in an automated way.

5. Eat Your Veggies

The protocol that Facebook employees are expected to follow around “major [or] sensitive” updates to people’s News Feeds.

6. Blame Tool

This is an internal tool Facebook’s researchers can use on a given “bad post” to see what kind of on-platform triggers caused it to bubble up in a person’s feed.

7. Blackhole

Another internal tool used by researchers to blacklist any URLs, domains, or IP addresses that are associated with spam or otherwise icky content. Blackhole attaches different labels (like “OK,” “BAD,” or “IFFY”) to each of these elements, and each label has a corresponding effect on how that URL/domain/IP can be seen and shared across Facebook as a whole.
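
For illustration only, here’s a toy version of how labels like those could map to enforcement effects. The “OK”/“IFFY”/“BAD” labels are the ones named in the documents; the effects themselves are invented for the example.

```python
# Illustrative only: a toy mapping from Blackhole-style labels to made-up effects.
BLACKHOLE_EFFECTS = {
    "OK":   {"can_share": True,  "demote": False},
    "IFFY": {"can_share": True,  "demote": True},   # hypothetical: visible but downranked
    "BAD":  {"can_share": False, "demote": True},   # hypothetical: blocked outright
}

def enforcement_for(label: str) -> dict:
    """Look up what a label would mean for a URL, domain, or IP in this sketch."""
    return BLACKHOLE_EFFECTS.get(label, BLACKHOLE_EFFECTS["IFFY"])

print(enforcement_for("BAD"))  # {'can_share': False, 'demote': True}
```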

8. FUSS

This stands for “Facebook Unified Signal Sharing/Feed Unified Scoring System.” Internally, the Integrity team would classify posts on people’s News Feeds under different FUSS categories depending on the “quality” of a given entity. Low-quality posts were labeled “FUSS Red,” “borderline” content was labeled “FUSS Yellow,” and regular, high-quality posts were “FUSS Green.” The research team also ran an experiment known as “FUSS Black,” which was their attempt to filter out as much Red and Yellow content from a given feed as possible.
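
If it helps, here’s a minimal sketch of the “FUSS Black” idea: given posts already tagged Red, Yellow, or Green, keep only the Green ones. The color labels come from the documents; the data shapes and function below are hypothetical.

```python
# Minimal sketch of "FUSS Black"-style filtering under assumed data shapes.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    fuss_label: str  # "red", "yellow", or "green"

def fuss_black_filter(feed: list[Post]) -> list[Post]:
    """Drop low-quality (Red) and borderline (Yellow) posts from a feed."""
    return [post for post in feed if post.fuss_label == "green"]

feed = [Post("p1", "red"), Post("p2", "green"), Post("p3", "yellow")]
print(fuss_black_filter(feed))  # [Post(post_id='p2', fuss_label='green')]
```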

9. Hex

The team’s internal term to refer to “human exploitation,” or human trafficking.

10. Banhammer

A tool used internally to remove all of the likes or follows from a given Facebook user, or group of Facebook users. One use-case brought up internally for the Banhammer was cutting out all the likes/follows from a user after they’d been banned from the platform.

11. Yoda

An in-house text processing tool used to sift through people’s posts, at scale. Supposedly named after the funny-talking green alien of the same name.

12. VPV

This stands for “Viewport Views.” This is a pretty foundational metric that Facebook employees use to calculate how often a piece of content—a post, a video, someone’s Story—was actually viewed by a given number of Facebook users. “Viewport,” in this case, refers to your laptop or phone screen. One viewport view = one entity, fully loaded, on that screen and in front of your eyeballs.
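
As a back-of-the-envelope illustration, a viewport-view counter might look something like this: an impression only counts once the content has fully loaded inside the visible screen. The event format here is made up for the example.

```python
# Hypothetical sketch of a viewport-view (VPV) counter; the impression-log
# format is invented for illustration.
from collections import Counter

def count_vpv(impression_log: list[dict]) -> Counter:
    """Tally viewport views per content ID from a stream of impression events."""
    vpv = Counter()
    for event in impression_log:
        if event.get("fully_loaded") and event.get("in_viewport"):
            vpv[event["content_id"]] += 1
    return vpv

# Example: two qualifying impressions of the same post -> VPV of 2.
log = [
    {"content_id": "post_1", "fully_loaded": True,  "in_viewport": True},
    {"content_id": "post_1", "fully_loaded": True,  "in_viewport": True},
    {"content_id": "post_2", "fully_loaded": False, "in_viewport": True},
]
print(count_vpv(log))  # Counter({'post_1': 2})
```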

13. USI

“Unwanted social interactions,” or USI, can include harassing messages, unwanted Friend Requests, or really any sort of contact from one Facebook user that another user doesn’t want.

14. TRIPS

Stands for “Tracking Reach of Integrity Problems.” TRIPS was a foundational internal survey meant to measure what users thought of the content they were seeing on the platform. It tracked the prevalence of hate speech and harassment that users came across, but also content that the Integrity team determined might be of “civic value.” At the end of the day, this sort of tracking is “meant to improve the quality of civic conversations” on the platform.

15. SUMA

“Single User Multiple Accounts,” or SUMA, refers to sockpuppet accounts used to manipulate conversations on Facebook. ‘Nuff said.

16. Shield

This is the internal program that either adds speedbumps to any efforts to crack down on a particular piece of content, or completely prevents any attempts to crack down on that content. Shield was specifically implemented for Facebook pages belonging to celebrities or public figures, for example, in order to prevent one of Facebook’s algorithms from automatically pulling one of that figure’s posts—a move that would undoubtedly spell a PR nightmare for Facebook, if the public figure happened to notice.

17. SEV

Short for “Site Event,” SEV is what the company calls a platform-wide issue that affects overall Facebook service. Think the recent 6-hour outage of Facebook, Instagram, and WhatsApp.

17. ROPS/RODS

These acronyms refer to “Repeat Offender Pages” and “Repeat Offender Domains,” meaning the page or domain in question committed at least three offenses (or platform violations) during a 90-day period.
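
The rule itself is simple enough to sketch: count the violations that fall inside a rolling 90-day window and check whether there are at least three. The bookkeeping below is hypothetical; only the threshold and the window come from the documents.

```python
# Sketch of the repeat-offender rule: >= 3 violations within a 90-day window.
from datetime import date, timedelta

def is_repeat_offender(violation_dates: list[date],
                       as_of: date,
                       window_days: int = 90,
                       threshold: int = 3) -> bool:
    """Return True if the page/domain has enough recent violations to qualify."""
    cutoff = as_of - timedelta(days=window_days)
    recent = [d for d in violation_dates if cutoff <= d <= as_of]
    return len(recent) >= threshold

# Example: three strikes inside 90 days trips the rule.
strikes = [date(2021, 8, 1), date(2021, 9, 10), date(2021, 10, 20)]
print(is_repeat_offender(strikes, as_of=date(2021, 10, 25)))  # True
```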

18. (P)rating

Another complicated mathematical model! This one’s used to predict how the company’s in-house team of professional News Feed Raters would rank the content on a given feed. The example given in Facebook’s internal glossary is “how good” a particular story might be, on a 1-5 rating scale.

19. Orb

An in-house search tool specifically geared towards sniffing out spam attacks on the Facebook platform.

20. Bouncer

An internal tool that the Integrity team used in order to crack down on “relatively small” lists of pages or people. Because we’re talking about a company with Facebook’s scale, “small” in this case means “on the order of thousands,” according to an internal document.

21. Blue

How researchers refer to the main Facebook app, which is... blue. “Blue time,” which is an example they give, refers to the total amount of time someone spends on said blue app. Makes sense!

22. Magnet User

This is the term the Integrity team used when talking about a given Facebook user who’s “hyper-engaged” with bad content.

23. ACDC

The algorithm that classifies the clusters of groups that get produced by the company’s other algorithms. Confusing, right? In this case, it just means that if one algorithm catches a bunch of (potentially sockpuppet-y) accounts sharing a single URL, ACDC is the algorithm that classifies this cluster of (spammy) accounts as sharing that single URL.
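
In spirit, that last step is just a grouping operation, something like the hypothetical sketch below: bucket accounts by the URL they all shared so the cluster can be labeled. None of this is Facebook’s actual code; it’s only meant to make the description concrete.

```python
# Illustrative grouping in the spirit of the ACDC description above.
from collections import defaultdict

def cluster_by_shared_url(shares: list[tuple[str, str]]) -> dict[str, set[str]]:
    """Map each URL to the set of accounts that shared it."""
    clusters: dict[str, set[str]] = defaultdict(set)
    for account_id, url in shares:
        clusters[url].add(account_id)
    return dict(clusters)

shares = [("acct_1", "http://spam.example"), ("acct_2", "http://spam.example")]
print(cluster_by_shared_url(shares))  # {'http://spam.example': {'acct_1', 'acct_2'}} (set order may vary)
```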

24. Faux-tire

Literally fake satire. The glossary defines the term as “material meant to misinform/push propaganda,” while actively portraying itself as satire, in order to weasel out of the company’s fact-checking systems. For an idea of what this kind of content looks like, look no further than Alex Jones’s lawyers, who famously described Infowars as an outlet specializing in “humor, bombasity, sarcasm [and] wit.”

25. NFX

The acronym the company internally uses when referring to the steps Facebook users take in order to report bad stuff cropping up on their feed. Stands for “Negative Feedback eXperience” (yes, really).

26. NCII

Stands for “Non-Consensual Intimate Imagery.” This is colloquially known as revenge porn, but that term is considered inaccurate and harmful to victims.

27. HERO

HERO refers to an internal “High-Risk Early Review Operations” program that’s meant to predict which posts might go viral across the platform on any given day. Used to catch potentially harmful viral-posts-in-the-making before they actually go viral.

28. NSFA

Depending on the context, this acronym can refer to one of two kinds of content: “Not Safe For All” (meaning the content isn’t family-friendly), or “Not Safe For Ads,” meaning the content violates something in Facebook’s policies for advertisers.

29. MSI

Short for “Meaningful Social Interactions.” Internally, this is what employees referred to as “the goal metric” for people’s News Feeds. The company’s definition of “meaningful” is a bit of a moving target—other internal documents note that the different pieces that make up MSI change frequently, the same way a person’s understanding of what “meaningful” means might change over time.

As of 2020, the Integrity team was using metrics like the number of Likes and reshares a post received, along with the number of sticker comments (yes, sticker comments) people left under a given post to gauge its meaningfulness.
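
If you wanted to picture MSI as a formula, it would be something like a weighted sum over those signals. To be clear, the weights below are invented for illustration; per the documents, the real mix changed frequently.

```python
# Toy weighted-sum version of an MSI-style score. The signals are the ones the
# documents mention for 2020; the weights are hypothetical.
MSI_WEIGHTS = {"likes": 1.0, "reshares": 5.0, "sticker_comments": 2.0}  # illustrative only

def msi_score(signals: dict[str, int], weights: dict[str, float] = MSI_WEIGHTS) -> float:
    """Combine engagement counts into a single 'meaningfulness' number."""
    return sum(weights.get(name, 0.0) * count for name, count in signals.items())

print(msi_score({"likes": 120, "reshares": 4, "sticker_comments": 10}))  # 160.0
```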

29. MAD

Short for “Mark As Disturbing,” MAD refers to content that might be reported by users (and flagged by Facebook’s content moderators) for, you guessed it, being “disturbing.” Frequent offenders, according to other Facebook documents that we’ve reviewed, include “borderline nudity,” “gross medical videos or wounds,” and content that tries “minimizing or denying [the] Holocaust.” (It’s unclear whether Mark Zuckerberg signed off on that last one.)

30. WAP

I know what you’re thinking. But in this context, it stands for “Weekly Average People,” or the number of Facebook users that do some sort of action in a given week. Which brings us to...
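
In other words, it’s a distinct count over a week’s worth of events, roughly like this hypothetical sketch (the event format is made up for the example).

```python
# Minimal sketch of a "Weekly Average People"-style count: distinct users who
# performed some action in a given week.
def weekly_active_people(events: list[dict], week: str) -> int:
    """Count unique user IDs with at least one action in the given ISO week."""
    return len({e["user_id"] for e in events if e["week"] == week})

events = [
    {"user_id": "a", "week": "2021-W42"},
    {"user_id": "a", "week": "2021-W42"},
    {"user_id": "b", "week": "2021-W42"},
]
print(weekly_active_people(events, "2021-W42"))  # 2
```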

31. NAWI

AKA “Non-Abusive WAP Impact.” An internal score that Facebook’s researchers use when monitoring so-called “non-abusive” accounts, meaning that they were flagged for whatever reason but were confirmed to be “benign” after review.

32. VICN

This stands for “Violence Inducing Conspiracy Network.” The definition for what these “networks” look like is yet another moving target. Here’s how one of the leaked docs talks about VICN, in the context of two major Facebook groups (since removed) that were tied to the January 6 Capitol riots:

Over time, we have also increasingly found that not all harmful networks are so clear cut. With VICNs, we realized that not all harmful coordination is driven by an organization with command and control. With Stop the Steal and Patriot Party, we were able to designate movements for coordinating harm, though our interventions stopped short of treating either as a full network.

In this instance, it took a while for the company’s researchers to catch onto the fact that they were actually dealing with a “full network” of pages and people that were dedicated to an “adversarial harmful movement.” At least the documents show that these teams learned something from the event.

33. CAU

Short for “Care About Us,” CAU is used as a measure of the affinity Facebook’s users feel toward the platform (i.e., how much they feel Facebook “cares about us”). Facebook even has an internal task force dedicated to boosting people’s CAU—the Protect and Care team. Products that this team worked on include Facebook’s suicide and self-injury prevention tools, which internal documents note were the result of a close collaboration “with a team of experts in suicide prevention and policy.” The same document notes that these experts also helped introduce folks working on CAU “to people who’ve experienced suicidal thoughts or attempted suicide,” so they could take those experiences into account when designing these tools.

Regardless of your opinion on Facebook as a whole (and whether the company actually cares about any of us), just know that this team does important work that we should all be thankful for.

(The National Suicide Prevention Hotline in the U.S. is available 24 hours a day at 1-800-273-8255. A list of international suicide hotlines can be found here.)

34. BTG

Short for “Break The Glass.” This is a term used when talking about a hellish event (like the aforementioned Capitol riot) that necessitates harsher moderation practices than the company’s usual tactics. Banning Stop The Steal groups, for example, was internally referred to as part of Facebook’s “BTG Response.”

35. Cannibalism 

A grisly-sounding term for what’s pretty common in the business world: growing one product at the expense of another. You can see a pretty clear example with Instagram, which has been cannibalizing users from Facebook’s big blue app for the last couple of years.

The above list is only a brief selection of the terms Facebook uses internally. For a bigger picture, see the full glossary provided by Haugen below. (This document has been redacted to exclude the names of employees and external researchers who are not part of Facebook leadership.)

This story is based on Frances Haugen’s disclosures to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. The redacted versions received by Congress were obtained by a consortium of news organizations, including Gizmodo, the New York Times, Politico, the Atlantic, Wired, the Verge, CNN, and dozens of other outlets.
