[Image: Google CEO Sundar Pichai speaks in Wuzhen, China, in 2017. Credit: Du Yang/China News Service/Visual China Group via Getty Images]

YouTube ducks questions about “error” that nixed anti-Beijing comments

YouTube would only say its classifiers didn't consider "the proper context."

YouTube says it has "rolled out a fix" for an "error in our enforcement systems" that had led to the automatic deletion of comments that included two phrases critical of China's government. But in an email exchange and phone call with Ars Technica, a company spokeswoman declined to provide real details about why YouTube's software was deleting the comments in the first place.

As I explained on Tuesday, "共匪" means "communist bandit." It was a derogatory term used by Nationalists during the Chinese Civil War that ended in 1949. It continues to be used by Chinese-speaking critics of the Beijing regime, including in Taiwan.

"五毛" means "50-cent party." It's a derogatory term for people who are paid by the Chinese government to participate in online discussions and promote official Communist Party positions. In the early years of China's censored Internet, such commenters were allegedly paid 50 cents (in China's currency, the yuan) per post.

Until Tuesday, YouTube was automatically deleting any comment that included these phrases. I confirmed the behavior myself on Tuesday morning. Comments containing either phrase would disappear in less than a minute, while other comments—including ones containing other Chinese phrases—stayed on the site.

Users have been reporting this behavior since late last year, with little response from YouTube. That changed on Tuesday when high-profile news sites—starting with The Verge—began covering the story. Within 24 hours of The Verge story appearing, YouTube had fixed the error.

And YouTube says that it was an error, not a deliberate policy decision. But not everyone is convinced.

"This purported 'error' follows a long, disturbing pattern of Google censoring content to try to gain favor with the Chinese Communist Party," Sen Josh Hawley (R-Mo.) wrote in a Wednesday letter to Google CEO Sundar Pichai. Hawley is one of many who suspect this was a deliberate policy decision—not just an innocent mistake.

The case for transparency

On Wednesday, I exchanged emails and talked on the phone with a YouTube spokeswoman. She seemed eager to help but wasn't able to offer me much detail. She said that YouTube relies on classifiers to decide which comments to delete and that YouTube's classifiers didn't take into account "the proper context." She said she wasn't able to provide more detail than that.

I'm sure this wasn't her fault. In a big company like Google, decisions about what to tell the press are made several levels up from the people who actually talk to reporters like me. But I think Google as a company is making a mistake by being so secretive about this.

It seems plausible that there is an innocent explanation for YouTube's mistake. For example, maybe the phrases "五毛" and "共匪" appear frequently in heated arguments that include other abusive (but less political) Chinese phrases. It's easy to imagine an algorithm classifying them as abusive without appreciating the political ramifications of doing so—and without any of Google's human employees realizing it.

Alternatively, maybe people on the Chinese government's payroll figured out how to game YouTube's comment-moderation rules by flagging millions of comments critical of the Chinese government. Or maybe a low-level employee with Chinese government sympathies slipped the phrases into a list of banned phrases without the knowledge or approval of their bosses. It would be an easy thing to overlook in a company where the leadership mostly doesn't read Chinese.
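
To see why a mistake like this could hide in plain sight, here is a minimal, purely hypothetical sketch of the kind of banned-phrase filter the scenarios above describe. This is not YouTube's actual code or moderation pipeline; it only illustrates how an exact-substring blocklist would produce the behavior I observed—and why a reviewer who doesn't read Chinese might never notice the two entries that matter.

    # Hypothetical sketch of a naive banned-phrase comment filter.
    # This is NOT YouTube's system; it only shows how an exact-substring
    # blocklist would silently remove the comments described in this article.

    BANNED_PHRASES = [
        "badword1",  # ordinary abusive terms a moderation team means to block
        "badword2",
        "共匪",       # politically loaded phrases slipped into the list would look
        "五毛",       # just like the rest to anyone who can't read Chinese
    ]

    def should_delete(comment: str) -> bool:
        """Flag a comment for deletion if it contains any blocklisted phrase."""
        return any(phrase in comment for phrase in BANNED_PHRASES)

    # A comment calling other commenters paid trolls ("五毛") is removed...
    print(should_delete("这些评论都是五毛写的"))  # True
    # ...while an unrelated Chinese-language comment stays up.
    print(should_delete("我喜欢这个视频"))        # False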

Any of these explanations seems both more plausible and less damning than senior Google executives deliberately choosing to censor phrases to curry favor with Beijing. But if Google refuses to be transparent about how and why this error happened, a lot of people are going to assume the worst.