DOJ and FTC Sue TikTok for Violating Children’s Privacy Laws

by Esmeralda McKenzie
Privacy / Data Protection


Aug 03, 2024

The U.S. Department of Justice (DOJ), along with the Federal Trade Commission (FTC), filed a lawsuit against popular video-sharing platform TikTok for “flagrantly violating” children’s privacy laws in the country.

The agencies claimed the company knowingly permitted children to create TikTok accounts and to view and share short-form videos and messages with adults and others on the service.

They also accused it of illegally collecting and retaining a wide variety of personal information from these children without notifying or obtaining consent from their parents, in contravention of the Children’s Online Privacy Protection Act (COPPA).

TikTok’s practices also violated a 2019 consent order between the company and the government, in which it pledged to notify parents before collecting children’s data and to remove videos uploaded by users under 13 years old, they added.

COPPA prohibits online platforms from gathering, using, or disclosing personal information from children under the age of 13 unless they have obtained verifiable consent from their parents. It also requires companies to delete all the collected information at the parents’ request.
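
To make the statute’s mechanics concrete, here is a minimal, hypothetical sketch in Python of a COPPA-style gate: collection is blocked for under-13 users without verifiable parental consent, and a parent’s deletion request wipes whatever was collected. All names (ChildAccount, collect_personal_info, and so on) are invented for illustration and do not reflect TikTok’s actual systems.

```python
from dataclasses import dataclass, field

COPPA_AGE_THRESHOLD = 13  # COPPA covers children under 13


@dataclass
class ChildAccount:
    user_id: str
    age: int
    parental_consent: bool = False                      # verifiable parental consent on file?
    collected_data: dict = field(default_factory=dict)  # personal info retained for this user


def collect_personal_info(account: ChildAccount, key: str, value: str) -> bool:
    """Store a piece of personal information only if COPPA permits it."""
    if account.age < COPPA_AGE_THRESHOLD and not account.parental_consent:
        return False  # no verifiable parental consent: collection must not happen
    account.collected_data[key] = value
    return True


def handle_parental_deletion_request(account: ChildAccount) -> None:
    """Honor a parent's request to delete everything collected about the child."""
    account.collected_data.clear()
```

The deletion path is the piece the complaint says was broken in practice: requests came in, but the data was not removed.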

“Even for accounts that were created in ‘Kids Mode’ (a pared-back version of TikTok intended for children under 13), the defendants unlawfully collected and retained children’s email addresses and other types of personal information,” the DOJ said.

“Further, when parents discovered their children’s accounts and asked the defendants to delete the accounts and information in them, the defendants frequently failed to honor those requests.”

The complaint further alleged the ByteDance-owned company subjected millions of children under 13 to extensive data collection that enabled targeted advertising and allowed them to interact with adults and access adult content.

It also faulted TikTok for failing to exercise adequate due diligence during account creation. The complaint alleges the company built backdoors that let children bypass the age gate meant to screen out users under 13: children could sign in through third-party services like Google and Instagram, and TikTok would classify those accounts as “age unknown” accounts.
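
The sketch below illustrates the age-gate flaw as the complaint describes it, again in hypothetical Python (the AgeStatus and classify_signup names are invented for illustration, not TikTok’s code): routing third-party sign-ins around the birthdate prompt leaves accounts in an “age unknown” bucket that is never screened.

```python
from enum import Enum
from typing import Optional


class AgeStatus(Enum):
    UNDER_13 = "under_13"        # should be blocked or limited to Kids Mode
    THIRTEEN_PLUS = "13_plus"    # full access
    AGE_UNKNOWN = "age_unknown"  # the loophole category described in the complaint


def classify_signup(declared_age: Optional[int], via_third_party: bool) -> AgeStatus:
    """Classify a new account at sign-up.

    The alleged flaw: sign-ins through third-party services (e.g. Google,
    Instagram) skip the birthdate prompt entirely, so the account lands in
    AGE_UNKNOWN and a child under 13 is never screened out.
    """
    if via_third_party or declared_age is None:
        return AgeStatus.AGE_UNKNOWN  # age gate bypassed; no screening occurs
    return AgeStatus.UNDER_13 if declared_age < 13 else AgeStatus.THIRTEEN_PLUS
```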

“TikTok human reviewers allegedly spent an average of only five to seven seconds reviewing each account to make their determination of whether the account belonged to a child,” the FTC said, adding it will take steps to protect children’s privacy from firms that deploy “sophisticated digital tools to surveil kids and profit from their data.”

TikTok has more than 170 million active users in the U.S. While the company has disputed the allegations, the lawsuit is the latest setback for the video platform, which is already the subject of a law that would force a sale or a ban of the app by early 2025 over national security concerns. TikTok has filed a petition in federal court seeking to overturn that law.

“We disagree with these allegations, many of which relate to past events and practices that are factually inaccurate or have been addressed,” TikTok said. “We offer age-appropriate experiences with stringent safeguards, proactively remove suspected underage users, and have voluntarily launched features such as default screen time limits, Family Pairing, and additional privacy protections for minors.”

The social media platform has also faced scrutiny globally over child protection. European Union regulators handed TikTok a €345 million fine in September 2023 for violating data protection laws in its handling of children’s data. In April 2023, the U.K. Information Commissioner’s Office (ICO) fined it £12.7 million for illegally processing the data of 1.4 million children under 13 who used the platform without parental consent.

The lawsuit comes as the ICO revealed it has asked 11 media and video-sharing platforms to improve their children’s privacy practices or risk enforcement action. The names of the offending services were not disclosed.

“Eleven out of the 34 platforms are being asked about issues relating to default privacy settings, geolocation or age assurance, and to explain how their approach conforms with the [Children’s Code],” it said. “We are also speaking to some of the platforms about targeted advertising to set out expectations for changes to ensure practices are in line with both the law and the code.”
