Apple Fixes 'Asian' Adult Content Filter, but We Need More

Photo: Victoria Song/Gizmodo

In an ideal world, I wouldn’t have had to write a story about how Apple’s adult content filter on iOS blocks users from searching the word “Asian” in any browser. In that world, I also wouldn’t have to write the follow-up explaining that the issue is now fixed, but that the timing, shortly after a mass shooting in Atlanta that left eight people dead, six of whom were Asian women, is deeply problematic.

According to Mashable, the latest iOS 14.5 beta removes the “Asian” filter. Gizmodo has independently confirmed that this is the case. This is objectively a good thing. What’s not good is that this issue, according to Mashable, was reported to Apple on Dec. 7, 2019, by iOS developer Steven Shen. After more than a year of inaction, Shen took to Twitter in early February to raise public awareness of the issue, which Gizmodo and other news media covered. Gizmodo’s requests to Apple for a statement at the time went unanswered. Shen also reportedly told Mashable that while Apple never officially responded to his initial warning, an Apple employee did verify the problem on Twitter and “filed the issue internally.”

What this means is that the filter was in place before the first public reports of covid-19 in Wuhan, China. It means it was brought to Apple’s attention well before the phrases “kung flu” or “China virus” were ever uttered. It was in place as the AAPI community tried to raise awareness of a spike in hate crimes fueled by the pandemic. It was functional well after the issue finally started picking up steam in the media following the killing of Vicha Ratanapakdee, an 84-year-old Thai immigrant, in San Francisco, a city about 50 minutes from Cupertino. It is still technically functional today, a day when yet another elderly Asian-American woman was attacked in New York City as bystanders did nothing. You need to have downloaded the iOS 14.5 beta, after all, to get the fix.

Since iOS 14 publicly rolled out in September, there have been several updates to it. Since the issue gained public attention, there have been two. It’s still unclear how exactly Apple’s adult filters were determined, and whether there was human oversight or if this is an example of AI’s inherent weaknesses in tagging and filtering content. Apple has yet to comment on or address why “Asian” was the sole racial search term filtered for adult content, or whether fixing this issue was a priority once the company became aware of it. Perhaps this is an extremely difficult thing to fix, requiring several of Apple’s most brilliant minds, and the timing was unfortunate given the uptick in anti-Asian hate crimes. I don’t know. I’ve reached out to Apple for more context and a statement but have yet to hear back. That said, my gut feeling is that this isn’t a difficult task. It just wasn’t deemed a particularly important or urgent one.

I wish I could say that, as an Asian-American tech journalist who frequently reviews Apple products, the thought makes me angry. The reality is I am just so sad, and so tired.

It especially hurts in the wake of Atlanta because, as Gizmodo originally reported, this filter wasn’t even effective. If the intent was to block inappropriate content, the filter could easily be outsmarted with a few workarounds. You could search “Japanese schoolgirls” and see several images that parents would object to. But searching “Asian-American history” or “Stop Asian Hate” would turn up a message warning that you were trying to access inappropriate content. Ostensibly, this filter was meant to protect children from seeing pornography. What I can’t stop thinking about is how a child could innocently search for facts about an Asian elephant and then see a message that plants the idea that somehow anything “Asian” is adult content.

This is most certainly not what Apple intended, but it doesn’t erase the sting. It doesn’t change the fact that the way Apple handled this mirrors how Asian-American pleas to Stop Asian Hate went unheard for over a year. It underscores just how normal it is to hypersexualize Asian women, which, to be clear, is both racist and misogynist. It only magnifies the normalized racism and misogyny of the Atlanta shooter, who targeted Asian massage parlors to remove the “temptation” of his alleged “sex addiction.”

What’s done is done. Apple can’t go back in time, wave a magic wand, and pretend this never happened. Apple should’ve fixed this issue when it was first raised, but saying so now feels hollow. Lots of people and companies in positions of power and influence could have and should have done something sooner. They didn’t, and bemoaning that does nothing but rub salt in the wound.

The one thing that Apple absolutely should not do is stay silent in the hopes that quietly fixing this issue will limit how many people know this even happened. The AAPI community has been gaslit enough. Instead, it could own up to its mistake and outline how it intends to ensure that this sort of thing never happens again. Perhaps I’m wrong, but this doesn’t seem like a huge ask. But then again, neither did fixing the filter.
