TikTok Faces Class-Action Lawsuit Over Content Moderators’ ‘Unmitigated Exposure to Highly Toxic and Extremely Disturbing Images’

Photo Credit: Solen Feyissa

TikTok and its ByteDance parent company are officially facing a class-action lawsuit over their alleged failure “to implement workplace safety measures” for content moderators.

One Candie Frazier, a Las Vegas resident who has worked as a TikTok content moderator since January of 2018, recently submitted the class-action suit to a California federal court, naming as defendants both the short-form video-sharing app and Beijing-headquartered ByteDance.

The plaintiff has specifically accused the defendants of multiple counts of negligence (including “abnormally dangerous activity”) as well as violations of California’s Unfair Competition Law. But it bears noting at the outset that the filing party “has never been employed by ByteDance or TikTok in any capacity,” instead working for a company (and non-party to the action) called Telus International (NYSE: TIXT), per the suit.

Telus International, the 35-page complaint proceeds, hires content moderators for the decidedly popular (and controversial) TikTok. These professionals, however, “are constantly monitored by ByteDance and TikTok through their software,” the plaintiff maintains, including to verify “whether quotas are being met” and to track time spent away from the computer.

“ByteDance and TikTok cause Telus to withhold payment to Content Moderators if they are not on the TCS application beyond their allotted breaks (two fifteen-minute breaks and one hour-long lunch break for a twelve-hour workday),” the lawsuit discloses, stating also that “low wages, short-term contracts, and the trauma associated with the work” cause “high turnover” among the video-review employees.

Interestingly, the text likewise expands upon the length of employees’ days, emphasizing that TikTok content moderators (who must sign non-disclosure agreements, according to the plaintiff) “are required to work long hours—an average of 12 hours a day.” Each flagged TikTok clip is sent to two content moderators, the document also relays, who then “determine if the video should remain on the platform, be removed from the platform, or have its audio muted.”

“Due to the sheer volume of content, content moderators are permitted no more than 25 seconds per video, and simultaneously view three to ten videos at the same time,” the plaintiff’s action reads.

Keeping in mind these seemingly demanding workdays and the massive number of involved uploads, Candie Frazier says that she “was exposed to thousands of graphic and objectionable videos,” with all manner of gruesome examples highlighted.

The content moderator has developed PTSD, depression, and more as a result of this exposure, the lawsuit claims, alleging further that TikTok’s “strict standards created stress and that such stress contributed to and exacerbated Content Moderators’ risks of developing psychological trauma.”

“ByteDance and TikTok could have, but failed to, implement safeguards on their Content Moderation tools—including changing the color or resolution of the video, superimposing a grid over the video, changing the direction of the video, blurring portions of the video, reducing the size of the video, and muting audio—that could mitigate some of the harm caused by reviewing graphic and disturbing content,” the document continues.

“Without any meaningful counseling or similar mental support, Plaintiff and other Content Moderators were ill equipped to handle the mentally devastating imagery their work required them to view,” the lawsuit says towards its conclusion, proceeding to claim that “ByteDance and TikTok heavily punishes any time taken away from watching graphic videos.”

Lastly, it’s alleged that the lawsuit’s class size “is in the thousands,” with these individuals suffering “ongoing harm resulting from ByteDance’s and TikTok’s wrongful conduct.” The plaintiff is seeking, among other things, damages, “a medical monitoring fund,” and the implementation of “safety guidelines for all prospective content moderation operations.”
