Cruel reason users hidden on app

Video-sharing app TikTok "instructed moderators to suppress posts created by users deemed too ugly, poor or disabled for the platform," according to leaked documents obtained by The Intercept, an online publication known for investigations and analysis.

Some videos that were reportedly censored featured serious subjects such as military movements, natural disasters, "defamed civil servants" and material threatening to national security. But other suppressed videos showed more innocuous images related to "fat" people, run-down houses, "rural poverty, slums, beer bellies and crooked smiles."

The controls were reportedly put in place to ensure "rapid growth in the mould of a Silicon Valley start-up while simultaneously discouraging political dissent with the sort of heavy hand regularly seen in (TikTok's) home country of China."


This flies in the face of the lighthearted vibe of the viral video-sharing site - whose 800 million monthly users often post "challenges" re-creating choreographed dances and lip-synched songs - and its image as "a global paragon of self-expression and anything-goes creativity".

Some of the instructions for moderators were very specific, such as: "Scan uploads for cracked walls and 'disreputable decorations' in users' own homes," and then slam those users with "algorithmic punishments" that would "artificially" reduce the size of their audiences.

TikTok countered that the reported policies were either out of date or never formally in use, but The Intercept found the guidelines were followed in some form until at least late 2019.

The policy documents, which were written in Chinese and English and had never been revealed before, were corroborated by "conversations with multiple sources directly familiar with TikTok's censorship activities".

Moderators "were also told to censor political speech in TikTok livestreams, punishing those who harmed 'national honour' or broadcast streams about 'state organs such as police,'" by banning them from the platform altogether.

A list of reasons that a user's video clip might not be selected for the "For You" section of the app - which exposes it to a much wider audience - includes "abnormal body shape," "ugly facial looks," dwarfism, "eye disorders" and "too many wrinkles."

"Most of (these livestream guidelines) are either no longer in use, or in some cases appear to never have been in place," TikTok spokesperson Josh Gartner told The Intercept. Mr Gartner added that the rules around suppressing videos with unattractive, disabled or poor users "represented an early blunt attempt at preventing bullying, but are no longer in place, and were already out of use when The Intercept obtained them."

"Sources indicated that both sets of policies were in use through at least late 2019," The Intercept wrote.

This article originally appeared on the New York Post and was reproduced with permission