Meta, the parent company of Instagram, is under intense scrutiny as 33 states file a legal complaint accusing it of neglecting the privacy and safety of underage users on its platform. The lawsuit cites alarming figures: since 2019, Meta has received more than 1.1 million reports of users under the age of 13, yet the company allegedly “disabled only a fraction” of those accounts, in violation of federal children’s privacy law.
Persistent Privacy Violations
The complaint asserts that Meta continued to collect personal information, including locations and email addresses, from underage users without parental consent. If these violations are proven, Meta could face significant civil penalties, potentially amounting to millions of dollars.
Open Secret within Meta
According to the legal filing, Meta’s awareness of millions of Instagram users under the age of 13 is characterized as an “open secret” diligently concealed from the public. Internal documents, emails, and employee chats presented in the complaint depict a company that not only knew about the underage users but allegedly failed to implement effective age-verification systems.
Failure in Age Verification
Meta is accused of consistently declining to prioritize robust age-verification mechanisms, instead relying on methods that allowed users under 13 to lie about their age and create Instagram accounts. The filing further alleges that Meta officials misled the public in congressional testimony by overstating the effectiveness of the company’s age-checking process, despite internal awareness of how pervasive the problem was.
Instagram’s Targeting of Underage Users
Quoting internal communications, the complaint suggests that Instagram actively pursued and engaged with underage users. Instagram head Adam Mosseri’s acknowledgment in a company chat that “Tweens want access to Instagram, and they lie about their age to get it” raises concerns about the platform’s approach to younger demographics.
Meta’s Response and Defense
In response, Meta contends that the legal complaint misrepresents its decade-long efforts to create a safe online environment. The company accuses the states of using selective quotes and cherry-picked documents to undermine its work. Meta notes that Instagram’s terms of use prohibit users under 13 in the United States and says it has measures in place to remove underage accounts once they are identified.
Privacy Allegations and the Children’s Online Privacy Protection Act (COPPA)
The lawsuit centers on the Children’s Online Privacy Protection Act (COPPA), a 1998 federal law. COPPA requires online services directed at children to obtain verifiable parental consent before collecting personal information from users under 13. Violations can incur fines exceeding $50,000 per breach.
Meta’s Calculated Decision
The legal filing argues that Meta deliberately chose not to build systems to detect and exclude underage users due to the perceived importance of capturing this demographic for continued growth. This strategic decision, if proven, underscores the company’s alleged prioritization of business objectives over user safety and privacy.
Previous Privacy Violations and FTC Settlement
This isn’t Meta’s first brush with privacy concerns. In 2019, the company agreed to a record $5 billion settlement with the Federal Trade Commission (FTC) over charges of deceiving users about privacy controls. The current allegations of privacy violations against underage users compound Meta’s regulatory challenges.
Conclusion
As the legal battle unfolds, Meta finds itself at the center of a controversy that goes beyond financial penalties. The case raises fundamental questions about the responsibility of social media platforms in protecting the privacy and well-being of their youngest users. The outcome could have far-reaching implications for the regulation of online platforms and the safeguarding of children’s privacy on the internet.