The Federal Trade Commission on Wednesday proposed sweeping changes to a key federal rule protecting children’s online privacy, in one of the US government’s most significant attempts to strengthen consumer privacy in more than a decade.
The changes are intended to strengthen rules under the Children’s Online Privacy Protection Act of 1998, a law that restricts online tracking of minors by services such as social media apps, video game platforms, toy retailers and digital advertising networks. Regulators said the measures would “shift the burden” of online safety from parents to apps and other digital services while restricting how platforms can use and monetize children’s data.
The proposed changes would require some online services to turn off targeted advertising by default for children under 13. They would also ban online services from using personal details such as a child’s cellphone number to encourage young people to stay on their platforms longer. That means online services could no longer use personal data to bombard kids with push notifications.
The proposed updates would also strengthen security requirements for online services that collect children’s data and limit how long those services can retain that information. And they would limit the collection of student data by learning apps and other educational-tech providers, by allowing schools to consent to the collection of children’s personal details only for educational purposes, not for commercial purposes.
“Children should be able to play and learn online without being endlessly tracked by companies looking to hoard and monetize their personal data,” Lina M. Khan, the chair of the Federal Trade Commission, said in a statement on Wednesday. She added, “By requiring companies to better safeguard children’s data, our proposal places clear obligations on service providers and prohibits them from outsourcing their responsibility to parents.”
COPPA remains the central federal law protecting children online in the United States, though members of Congress have since introduced broader online safety bills for children and teenagers.
Under the COPPA law, online services aimed at children, or those that know they have children on their platform, must obtain parental consent before collecting, using or sharing personal details — such as first and last name, address and phone number — from a child under 13.
To comply with the law, popular apps like Instagram and TikTok have terms of service that prohibit children under 13 from setting up accounts. Social media and video game apps commonly ask new users to provide their dates of birth.
However, regulators have filed multiple complaints against big tech companies accusing them of failing to set up effective age-gating systems; showing targeted ads to children based on their online behavior without parental consent; enabling strangers to contact children online; or keeping children’s data even after parents asked for it to be deleted. Amazon; Microsoft; Google and its YouTube platform; Epic Games, the creator of Fortnite; and Musical.ly, the social app now known as TikTok, have all paid multimillion-dollar fines to settle charges that they broke the law.
Separately, a coalition of 33 state attorneys general filed a joint federal lawsuit in October against Meta, the parent company of Facebook and Instagram, alleging the company violated children’s privacy laws. In particular, the states criticized Meta’s age verification system, saying the company allowed millions of underage users to create accounts without parental consent. Meta said it had spent a decade working to make online experiences safe and age-appropriate for teenagers, and said the states’ complaint mischaracterized its work.
The FTC proposed stronger children’s privacy protections amid heightened public concern over the potential risks to the mental health and physical safety that popular online services may pose to young people. Parents, pediatricians and children’s groups have warned that social media content recommendation systems regularly surface inappropriate content promoting self-harm, eating disorders and plastic surgery to young people. And some school officials worry that social media platforms are distracting students from their work in class.
States this year have passed more than a dozen laws restricting minors’ access to social media networks or pornography sites. Industry trade groups successfully sued to temporarily block some of those laws.
The FTC began reviewing the children’s privacy rule in 2019, receiving more than 175,000 comments from tech and advertising industry trade groups, video content developers, consumer advocacy groups and members of Congress. The resulting measure runs over 150 pages.
The proposed changes include narrowing an exception that allows online services to collect persistent identification codes for children for certain internal operations, such as product improvement, personalization or fraud prevention, without parental consent.
The proposed changes would prohibit online operators from using such user-tracking codes to maximize the amount of time children spend on their platforms. That means online services can’t use techniques like sending mobile phone notifications “to prompt the child to interact with the site or service, without verifiable parental consent,” according to the proposal.
It is not yet known how online services would comply with the changes. Members of the public have 60 days to comment on the proposals, after which the commission will vote.
The initial reactions of industry trade groups were mixed.
The Software and Information Industry Association, whose members include Amazon, Apple, Google and Meta, said it was “grateful” for the FTC’s efforts to consider outside input and noted that the agency’s proposal cited the group’s recommendations.
“We are interested in participating in the next phase of the effort and hope that the FTC will take a like-minded approach,” said Paul Lekas, the group’s head of global public policy, in an email.
NetChoice, whose members include TikTok, Snap, Amazon, Google and Meta, said by contrast that the agency’s proposed changes went too far by setting defaults that parents may not want. The group has sued several states to block new laws that would limit minors’ access to online services.
“With this new rule, the FTC is overriding the wishes of parents,” Carl Szabo, the group’s general counsel, said in a statement, adding that it would “make it even more difficult for websites to provide necessary services to children as approved by their parents.”