The news: Video platform TikTok has published a set of new, more detailed guidelines governing which videos will be deleted from the app. It says it will take down videos promoting terrorism, crime, violence, hate speech or self-harm, for example.
The rules also ban “misleading information” that could cause harm to either an individual or the general public, going further than US competitors like Facebook, which have (controversially) tried to avoid making those sorts of judgments. TikTok also explicitly bans denying that “well-documented and violent events have taken place,” such as the Holocaust, while Facebook permits it.
What’s next: Writing the policies is the easy part—enforcing them is much harder, and TikTok hasn’t provided much detail on how it does that. At the end of last year, however, German publication Netzpolitik revealed leaked moderation guidelines showing how TikTok algorithmically suppresses certain videos by making them harder for users to find. Controversially, this included videos created by people with disabilities. TikTok was also criticized for banning a girl who had posted a video attacking the Chinese state’s treatment of Uighur people.
Plenty of wiggle room: In a blog post published yesterday, TikTok said the global guidelines “are the moderation policies TikTok’s regional and country teams localize and implement in accordance with local laws and norms,” which means it can choose what to suppress and what to promote country-by-country.