Utah AG Accuses TikTok of Knowing Minors Were Being Groomed on ‘Live’


TikTok is fighting wars on multiple fronts. Not only is it locked in a fight for its life with the federal government as it awaits its day before the Supreme Court next week, but it also has the Attorney General of Utah breathing down its neck. Bloomberg acquired a redacted version of a lawsuit filed by the state’s leading prosecutor alleging that TikTok knew its Live streaming feature was a breeding ground for illicit content and harmful behavior, including the grooming of children.

The lawsuit reveals two internal investigations that TikTok launched into the activity on its Live platform. The first, Project Meramec, found that there were underage users performing sexualized acts on livestreams, done in exchange for virtual gifts given to them by viewers.

At the time of the investigation, TikTok policy forbade users 16 years old or younger from broadcasting on Live, and it prevented users under the age of 18 from sending or receiving virtual gifts that could be redeemed for money. Enforcement of those rules fell short, however: the company’s internal review found that 112,000 underage users hosted livestreams during a single month in 2022. On top of that, the company found that its algorithm was boosting sexualized content, so those underage streamers were likely being recommended to viewers. There’s no real mystery as to why: TikTok gets a cut of every virtual gift purchased, so users who receive more gifts also generate more revenue for TikTok.

The second internal investigation, dubbed Project Jupiter, looked into money laundering operations that were being carried out using TikTok’s livestreaming service. That probe found that some criminal operations were using TikTok Live to move money around, while others were selling drugs and illegal services in exchange for virtual gifts. Internal communications between TikTok employees showed conversations about how Live may have been used to fund terrorist organizations like the Islamic State.

TikTok’s investigation into underage users followed an investigation published by Forbes that found numerous examples of older male users enticing young women to perform sexual acts on TikTok Live in exchange for gifts. Leah Plunkett, an assistant dean at Harvard Law School, told Forbes it was “the digital equivalent of going down the street to a strip club filled with 15-year-olds.”

It’s far from the first time TikTok’s lack of moderation, particularly as it relates to content involving minors, has gotten the company into hot water. Back in 2022, the US Department of Homeland Security launched an investigation into TikTok’s handling of child sexual abuse material. Earlier this year, the Federal Trade Commission and Department of Justice sued the company for violations of the Children’s Online Privacy Protection Act, alleging that the company knowingly allowed underage users to create accounts and interact with adults on the platform.

TikTok is not the only social platform with a child predator problem. Last year, the Wall Street Journal reported that Meta was struggling to remove pedophiles from Facebook and Instagram and that its algorithms were actively promoting and guiding users to child exploitation content. Twitter, under Elon Musk’s leadership, axed its moderation team responsible for monitoring child sexual abuse, saw networks of child pornography traders crop up on the platform, and actively unbanned users who had been booted for posting child exploitation content.

It’s possible that none of these platforms are good, actually.
