TikTok announced an expansion of its audience controls feature, allowing creators to restrict their videos to adult audiences.
Previously, the adults-only audience control was available only for TikTok LIVE.
Now, the company is bringing the feature to short-form videos as well.
TikTok wrote in a blog post: "We have begun bringing our audience controls feature to short-form video creators, and will expand this globally over the coming weeks.
To be clear, our policies still fully apply to creators who use this feature, and we will remove content that contains nudity or otherwise violates our Community Guidelines." As with adults-only livestreams on TikTok, the 18+ audience setting for videos is not a way for creators to post adult content, since that content remains subject to the app's policies.
Rather, TikTok views the setting as a way for creators to keep minors from encountering content that is aimed at adult audiences or unlikely to interest them.
When TikTok launched adults-only livestreams, the company said the setting could be useful for creators who want to share comedy better suited to people over 18.
Or, a creator may want to discuss a difficult life experience and feel more comfortable limiting that conversation to adults.
The expanded audience controls follow TikTok's earlier statement that it wants to begin identifying which content is suitable for younger and older teens rather than adults.
TikTok previously said it was developing a system to identify certain types of content and restrict teens' access to it.
It would also start asking creators to indicate when their content is better suited to an adult audience.
With this expansion, the app's audience controls are now being put into practice.
TikTok also announced that it is rolling out the next iteration of its borderline suggestive model, which automatically identifies sexually explicit, suggestive, or borderline content.
The next iteration of the model is expected to detect such content more accurately.
These announcements are part of TikTok's broader push to strengthen the youth-safety features of its app.
Last year, TikTok introduced "Content Levels" to keep some content with more mature or complex themes from reaching teens.
As part of these efforts, TikTok said that in the past 30 days alone it has prevented more than 1 million overtly sexually suggestive videos from being viewed by teen accounts.
Child and teen safety is an area where TikTok faces significant scrutiny, not only from regulators and lawmakers but also from parents.
For example, last year a group of parents sued TikTok after their children died attempting dangerous challenges they allegedly saw on the platform.