TikTok is testing age restrictions for younger users' recommendations

The company wants to classify content based on "content maturity and thematic comfort zones."

TikTok is beginning to test ways to age-restrict some types of content in its app amid a push to beef up safety features for teens. The work is in an early stage, but the company says the goal is to shield younger users from mature content or other potentially inappropriate videos.

TikTok, like other social media apps, has recently faced increased scrutiny for how it handles user safety, particularly for children and younger teens. At a Congressional hearing last fall, the company’s VP of Public Policy hinted that the app was working on new ways “to enjoy content based on age appropriateness.” Now, the company has shared a few more details of what that may look like.

TikTok, long known for its eerily precise recommendation algorithm, is working on features that would be able to classify content based on “content maturity and thematic comfort zones,” according to Tracy Elizabeth, TikTok’s global issue policy lead. “When the system is fully launched, content that we've identified as containing overtly mature themes could be restricted from teens,” she said during a briefing with reporters. “And for content that has less … mature themes, our community members are going to be able to choose the comfort zones or content maturity that they would prefer to skip or opt into.”

Elizabeth didn’t elaborate on how the company was determining a video’s maturity level, saying the work was in an “innovation phase.” But she said it could eventually resemble the ratings used for film, television and video games. “We know that there’s family-ish content, there’s teen-ish content, there’s adult-ish content,” she said. “What we’d like to do is … say ‘here you go: you can pick for yourself what is that category that you feel most comfortable with.’” She added that parents could also control these preferences for their children via TikTok’s “Family Pairing” settings.

Separately, TikTok is also working on a feature for creators that would allow them to indicate whether their videos are intended for adults or younger users. This could help further inform TikTok’s recommendations to ensure that more mature content stays out of the feeds of younger users.

While TikTok is now running a small test of the age restrictions, it could still be some time before the features are widely available, and Elizabeth noted that they are still taking shape. “A lot of this we haven’t fully decided how we’re going to do it,” she said.

Outside of those features, TikTok also shared an updated set of community guidelines. Under the new rules, TikTok is giving suicide hoaxes and dangerous challenges their own section in an effort to make the policy more visible. The company is also expanding the types of content it bars under its eating disorder policy. The new rules will prohibit videos that promote “disordered eating,” like extreme calorie counting, short-term fasting, overexercise and other “under-recognized signs of a potential problem.”

Finally, TikTok is also updating its rules to explicitly ban “deadnaming, misgendering, or misogyny as well as content that supports or promotes conversion therapy programs.” The company says it was already removing these types of posts, but the rules weren’t specifically outlined in its public-facing community guidelines.