Senator Markey, Representatives Castor and Trahan urge FTC to use its authority to force tech companies to comply with new platform policies


In response to the UK's new Age Appropriate Design Code, popular apps and websites recently announced significant changes to their official policies for young users.

Washington (October 8, 2021) – As big tech companies announced policy changes to protect young online users in response to a new UK child privacy law, Senator Edward J. Markey (D-Mass.) and Representatives Kathy Castor (FL-14) and Lori Trahan (MA-03) today wrote to the Federal Trade Commission urging the agency to use its full powers, including its authority under Section 5 of the FTC Act, to ensure that these companies comply with their new policies. The Age Appropriate Design Code (AADC) went into effect in the UK in September and requires online services available to children and adolescents to adhere to 15 key child privacy standards, many of which are similar to legislative proposals in the United States aimed at updating Senator Markey’s 1998 law, the Children’s Online Privacy Protection Act (COPPA).

“The need to protect young people from threats to online privacy is more urgent than ever. Since 2015, American children have spent nearly five hours a day staring at their screens, and the daily screen time of children and adolescents has increased by 50% or more during the coronavirus pandemic,” lawmakers wrote in their letter. “We therefore encourage you to use all the tools at your disposal to carefully scrutinize companies’ data practices and ensure that they meet their public commitments.”

A copy of the letter is available HERE.

In response to the AADC, Instagram publicly announced that it is “defaulting young people into private accounts, making it harder for potentially suspicious accounts to find young people, [and] limiting the options available to advertisers to reach young people with ads.” Google and its affiliate YouTube have announced that they will “make product experiences suitable for children and teens” by changing the default video upload setting for teens aged 13 to 17 to “private”; turning off location history (without the option to turn it back on) for users under the age of 18; and “block[ing] ad targeting based on the age, gender, or interests of those under 18,” among other changes. Last year – in a similar vein, before the AADC took effect – TikTok said it had disabled direct messaging for users under 16 and expanded parental controls.

The lawmakers note, “These policy changes do not replace Congressional action on children’s privacy, but they are important steps in making the Internet safer for young users.”
