TikTok Announces New Policy Changes to Make the Platform More “Safe and Secure”

On the heels of its first Congressional hearing on product safety, TikTok has announced policy changes aimed at making the short-form video platform safer and more secure, particularly for minors, LGBTQ users and minority communities. In October, TikTok vice president and head of public policy Michael Beckerman testified alongside executives from Snapchat and YouTube, answering questions from U.S. senators about the platform’s impact on teen eating disorders and the fallout from dangerous hoaxes. The policy updates address those concerns and institute new cybersecurity measures intended to protect user data from unauthorized access.

The changes include instituting age-appropriateness designations, clarifying policies around prohibited topics such as hate speech, and expanding the platform’s focus on dangerous acts and challenges. “These updates clarify or expand upon the types of behavior and content we will remove from our platform or make ineligible for recommendation in the For You feed,” TikTok says in its news release.

The policy updates follow news that the Commerce Department is taking steps to mitigate security risks posed by foreign-owned social media platforms, including TikTok, which is owned by China’s ByteDance. As TechCrunch reports, the changes appear at least in part to be a response to the Commerce Department’s actions. TikTok says it “will open cyber incident monitoring and investigative response centers in Washington, D.C., Dublin and Singapore this year, as part of this expanded effort to better prohibit unauthorized access to TikTok content, accounts, systems and data.”

TikTok now specifies that “practices like deadnaming and misgendering, misogyny or content supporting or promoting conversion therapy programs will not be permitted,” reports TechCrunch. The company says these subjects were already prohibited, but it heard from creators and civil society organizations that its written policies should be more explicit. GLAAD, which collaborated with TikTok on the changes, issued a statement from CEO Sarah Kate Ellis saying the platform “raises the standard for LGBTQ safety online” and sets an example that other platforms should emulate.

TikTok is aiming to create a system that restricts minors from accessing inappropriate content. Though the policy is still in development, it involves three parts: community members will be able to select which “comfort zones,” indicated by content maturity levels, they want the app to display; parents and guardians will be able to use TikTok’s existing Family Pairing feature to make maturity-level decisions on behalf of minors; and creators will be asked to label content suitable only for adult audiences.

On dangerous acts and challenges, TikTok “says it already removes ‘eating disorder’ content, like content that glorifies bulimia or anorexia, but it will now broaden its policy to restrict the promotion of ‘disordered eating’ content,” according to TechCrunch, which notes the platform has recently updated its Safety Center “and other resources in the wake of upsetting, dangerous and even fatal viral trends, including ‘slap a teacher,’ the blackout challenge and another that encouraged students to destroy school property.”


Photo Credit: Ilina Yuliia / Shutterstock.com