In 2019, the U.S. Federal Trade Commission (FTC) fined Google a record-breaking $170 million for violating the Children’s Online Privacy Protection Act (COPPA) on YouTube.
The FTC alleged that Google collected personal information from children under 13 without parental consent, including viewing histories, device identifiers, and location data.
This information was then used to serve targeted advertising, violating COPPA’s regulations.
Settlement and Changes
A settlement was reached, requiring significant changes to YouTube’s practices:
- Clear labeling: All content aimed at children must be clearly labeled and treated as such, regardless of the viewer’s age.
- Limited data collection: Data collection and use on content made for kids will be limited to what’s necessary for the platform’s operation.
- No targeted advertising: Targeted advertising will be prohibited on content designated for children.
- Restricted features: Comments, notifications, and other features that could expose children to risks will be disabled on designated content.
Google’s Response
Google implemented these changes and built automated systems to help identify content made for kids, supplementing creators’ own designations. However, concerns persist about the effectiveness of these measures and the potential for loopholes.
Criticisms and Ongoing Scrutiny
Critics like Senator Ed Markey and FTC Commissioner Rebecca Slaughter argue that the fine and settlement are insufficient to protect children on YouTube.
They call for stricter enforcement and regulations. Regulatory bodies and child advocacy groups continue to monitor the situation closely.
Current Status (February 21, 2024)
- The fine and mandated changes remain in effect.
- Concerns about child privacy on YouTube persist.
- The evolving digital landscape with new technologies like the metaverse presents new challenges.
- Collaboration between tech companies, regulators, and advocates is crucial for creating a safer online environment for children.
Google Fined for YouTube Child Privacy Violation: Frequently Asked Questions
What happened?
In 2019, the U.S. Federal Trade Commission (FTC) fined Google a record $170 million for violating the Children’s Online Privacy Protection Act (COPPA) on YouTube. The agency alleged that Google collected personal information from children under 13 without parental consent, including viewing histories, device identifiers, and location data, and used that information to serve targeted advertising.
What were the specific violations?
The FTC accused Google of:
- Collecting personal information from children without parental consent.
- Using this information to serve targeted advertising to children.
- Failing to clearly label content aimed at children.
- Not having adequate safeguards to protect children’s privacy.
What changes were made as a result of the fine?
As part of a settlement, Google agreed to:
- Treat all data from viewers watching children’s content as coming from a child, regardless of the user’s age.
- Limit data collection and use on content made for kids to what is necessary for the operation of the service.
- Disable targeted advertising and features like comments and notifications on content designated for children.
- Clearly label all content aimed at children.
Are there still concerns about child privacy on YouTube?
Yes, there are ongoing concerns about how effectively Google protects children’s privacy on YouTube. Critics argue that:
- The changes implemented are insufficient to fully address the problem.
- Google may still be collecting more data than necessary from children.
- The platform still exposes children to risks like targeted advertising and inappropriate content.
Regulatory bodies and child advocacy groups continue to monitor the situation closely, and the evolving digital landscape presents new challenges for child privacy protection.
Note: this was originally published in September 2019 but has been updated.