TikTok Is Auto-Deleting Videos That Violate Its Terms – Here’s What’s Getting Ripped Down


Photo Credit: TikTok

TikTok will begin using algorithms to automatically delete content that violates its community guidelines.

The ByteDance-owned social network is also making some changes to how it notifies users. TikTok says the new system is designed to help reduce the number of distressing videos that its human safety team must review. The moderation team will then be free to focus on areas like bullying, hate speech, and harassment.

TikTok is testing automatic deletion of content that violates its policies in categories such as adult nudity, sexual activity, violent content, illegal activities, and regulated goods. The company has been experimenting with auto-deleting these violations outside the U.S. for months. TikTok says its algorithms' false positive rate is around 5%, and requests to appeal content removals have remained consistent since automation was introduced to the process.
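As a rough illustration of how this kind of routing might work (purely a sketch; the category names, the classifier fields, and the 0.95 threshold below are assumptions, not TikTok's actual system), high-confidence flags in the auto-removal categories would be deleted outright, while everything else lands in the human review queue:

```python
# Hypothetical sketch of an auto-moderation routing step -- not TikTok's actual code.
from dataclasses import dataclass

AUTO_REMOVE_CATEGORIES = {
    "adult_nudity", "sexual_activity", "violent_content",
    "illegal_activities", "regulated_goods",
}
CONFIDENCE_THRESHOLD = 0.95  # assumed value, tuned to keep false positives low

@dataclass
class Flag:
    video_id: str
    category: str      # policy area the classifier matched
    confidence: float  # model score between 0 and 1

def route(flag: Flag) -> str:
    """Decide whether a flagged video is auto-removed or sent to human review."""
    if flag.category in AUTO_REMOVE_CATEGORIES and flag.confidence >= CONFIDENCE_THRESHOLD:
        return "auto_remove"         # deleted without a reviewer ever seeing it
    return "human_review_queue"      # safety team handles bullying, hate speech, harassment

print(route(Flag("v123", "regulated_goods", 0.98)))  # -> auto_remove
print(route(Flag("v456", "bullying", 0.97)))         # -> human_review_queue
```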

TikTok also wants to bring more transparency to its moderation process as it rolls out these algorithms.

The company will also report how many accounts it removes because they belong to users under the age of 13. TikTok has faced stiff fines from the FTC for violating child privacy laws like COPPA. TikTok is also making some changes to how it notifies users who have uploaded content that violates its guidelines.

“People will be notified of the consequences of their violations starting in the Account Updates section of their Inbox. There, they can see a record of their accrued violations,” the TikTok blog post reads. There are several categories of violations, including video, comments, direct message, hashtag, profile, sound, and live.
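To picture what an accrued-violations record in the Account Updates section might contain (the field names and types below are illustrative assumptions, not TikTok's schema), a minimal sketch could look like this:

```python
# Hypothetical sketch of a per-account violation record -- field names are assumptions.
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class ViolationType(Enum):
    VIDEO = "video"
    COMMENT = "comment"
    DIRECT_MESSAGE = "direct_message"
    HASHTAG = "hashtag"
    PROFILE = "profile"
    SOUND = "sound"
    LIVE = "live"

@dataclass
class Violation:
    violation_type: ViolationType
    policy: str            # e.g. "harassment" or "regulated_goods"
    occurred_at: datetime

@dataclass
class AccountUpdates:
    """Aggregates a user's accrued violations, as the inbox section might."""
    user_id: str
    violations: list[Violation] = field(default_factory=list)

    def record(self, violation: Violation) -> None:
        self.violations.append(violation)

    def total_violations(self) -> int:
        return len(self.violations)
```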

How TikTok's Auto-Deletion and Violation Penalties Work

TikTok has also outlined how these consequences will work for repeat content violations. Upon the first violation, users will get an in-app warning unless the violation falls under a zero-tolerance policy. Zero-tolerance policies include (but are not limited to) child sexual abuse and pornographic material.

After the First Violation

Users who continue to violate TikTok’s policies can expect to see the following consequences:

  • Suspension of the ability to upload videos, comment, or edit the profile for 24 to 48 hours
  • Restriction of the account to a view-only experience for 72 hours up to one week
  • A ban warning after several violations
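A minimal sketch of how that escalation ladder could be encoded (the mapping of specific violation counts to specific penalties below is an assumption; TikTok has not published exact thresholds):

```python
# Hypothetical escalation ladder -- exact violation-count thresholds are assumptions.
ZERO_TOLERANCE_POLICIES = {"child_sexual_abuse_material", "pornography"}

def next_consequence(violation_count: int, policy: str) -> str:
    """Map an account's accrued violations to the next enforcement step."""
    if policy in ZERO_TOLERANCE_POLICIES:
        return "ban"                                 # zero-tolerance: no warning step
    if violation_count <= 1:
        return "in_app_warning"                      # first violation: warning only
    if violation_count == 2:
        return "suspend_upload_comment_edit_24_48h"  # posting features suspended
    if violation_count == 3:
        return "view_only_72h_to_1_week"             # account restricted to view-only
    return "ban_warning"                             # several violations: warned of a ban

print(next_consequence(1, "harassment"))  # -> in_app_warning
print(next_consequence(4, "harassment"))  # -> ban_warning
```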

“We developed these systems with input from our US Content Advisory Council, and in testing them in the US and Canada over the last few weeks, over 60% of people who received a first warning for violating our guidelines did not have a second violation,” TikTok claims.
