This is the story of Li An, a pseudonymous former employee at ByteDance, as told to Protocol’s Shen Lu.
I wasn’t proud of it, and neither were my coworkers. But that’s life in today’s China.
It was the night Dr. Li Wenliang struggled for his last breath in the emergency room of Wuhan Central Hospital. I, like many Chinese web users, had stayed awake to refresh my Weibo feed constantly for updates on his condition. Dr. Li was an ophthalmologist who sounded the alarm early in the COVID-19 outbreak. He soon faced government intimidation and then contracted the virus. When he passed away in the early hours of Friday, Feb. 7, 2020, I was among many Chinese netizens who expressed grief and outrage at the events on Weibo, only to have my account deleted.
I felt guilt more than anger. At the time, I was a tech worker at ByteDance, where I helped develop tools and platforms for content moderation. In other words, I had helped build the system that censored accounts like mine. I was helping to bury myself in China’s ever-expanding cyber grave.
I hadn’t received explicit directives about Li Wenliang, but Weibo was certainly not the only Chinese tech company relentlessly deleting posts and accounts that night. I knew ByteDance’s army of content moderators was using the tools and algorithms that I helped develop to delete content, change the narrative and alter memories of the suffering and trauma inflicted on Chinese people during the COVID-19 outbreak. I couldn’t help but feel every day like I was a tiny cog in a vast, evil machine.
ByteDance is one of China’s largest unicorns and creator of short video-sharing app TikTok, its original Chinese version Douyin and news aggregator Toutiao. Last year, when ByteDance was at the center of U.S. controversy over data-sharing with Beijing, it cut its domestic engineers’ access to products overseas, including TikTok. TikTok has plans to launch two physical Transparency Centers in Los Angeles and Washington, D.C., to showcase content moderation practices. But in China, content moderation is mostly kept in the shadows.
I was on a central technology team that supported the Trust and Safety team, which sits within ByteDance’s core data department. The data department is mainly devoted to developing technologies for short-video platforms. As of early 2020, the technologies we created supported the entire company’s content moderation in and outside China, including Douyin at home and its international equivalent, TikTok. About 50 staff worked on the product team, and between 100 and 150 software engineers worked on the technical team. Additionally, ByteDance employed about 20,000 content moderators to monitor content in China. They worked at what are known internally as “bases” (基地) in Tianjin, Chengdu (in Sichuan), Jinan (in Shandong) and other cities. Some were ByteDance employees, others contractors.