A former TikTok moderator is suing the company, claiming it failed to protect her mental health after “constant” exposure to traumatic video content.
Candie Frazier says she reviewed videos that featured “extreme and graphic violence” for up to 12 hours a day.
She says she suffers from “significant psychological trauma”, including anxiety, depression, and post-traumatic stress disorder.
TikTok says it strives to promote “a caring working environment”.
In September, TikTok announced that one billion people were using the app each month. Its domain now attracts more traffic than Google's, according to Cloudflare, an IT security company.
To protect its users, the video-sharing platform uses thousands of in-house and contract content moderators to filter out videos and accounts that break its rules.
Ms Frazier is suing both TikTok and its parent company, Chinese tech giant ByteDance.
She claims that in her role as a moderator she watched graphic content, including videos of sexual assault, cannibalism, genocide, mass shootings, child sexual abuse, and the mutilation of animals.
Ms Frazier, who worked for a third-party contractor, Telus International, says she was required to review hundreds of videos a day.
According to the lawsuit filed with a federal court in California last week, Ms Frazier suffered “significant psychological trauma, including anxiety, depression, and post-traumatic stress disorder” because of the material she was required to review.
The lawsuit claims that while she was not a TikTok employee, the social-media giant “controlled the means and manner in which content moderation occurred”.
Ms Frazier alleges that in order to handle the volume of content she was expected to review, she had to watch as many as 10 videos simultaneously.
The lawsuit claims that during a 12-hour shift, moderators were permitted a 15-minute break after the first four hours of work, then a 15-minute break every two hours thereafter, in addition to a one-hour lunch break.
It alleges that TikTok failed to meet industry standards designed to reduce the impact of content moderation, and that the firm violated California labour law by not providing a safe work environment.
TikTok would not comment on the “on-going” case, but the firm did say it strived to “promote a caring working environment for our employees and contractors”.
The company added: “Our safety team partners with third-party firms on the critical work of helping to protect the TikTok platform and community, and we continue to expand on a range of wellness services so that moderators feel supported mentally and emotionally.”
TikTok believes its measures to protect moderators are in line with industry best practice.
Last year, TikTok was among a coalition of social-media giants that created guidelines to safeguard employees who have to filter out child sex-abuse imagery.
Telus International, which is not a defendant in the case, said it had robust mental-health programmes in place and told the Washington Post that its employees could raise concerns through “several internal channels” – something it claimed Ms Frazier had not done.
The firm told the paper Ms Frazier’s allegations were “entirely inconsistent with our policies and practices”.
In 2020, another social-media goliath, Facebook, agreed to pay out $52m (£39m) in compensation to moderators who had developed PTSD as a result of their job.