Suing Meta: What US prosecutors know about how the company harms children
Lawsuits were filed on Tuesday by 41 states, including one joint suit brought by 33 attorneys general, accusing the owner of Instagram and Facebook of ‘taking advantage of children’s pain.’ It marks a step forward in the push to hold the hitherto untouchable social media platforms accountable
The legal crusade against Meta went one step further on Tuesday. The attorneys general of 41 states sued Meta, the parent company of Facebook, Instagram, WhatsApp and Messenger, for developing products consciously designed to engage children, even as the company maintains that its social media platforms are safe for minors. This latest move adds to the cascade of lawsuits — 200 of which have been grouped into a class action suit filed in April — that individuals and educational institutions have brought against several social platforms (Facebook and Instagram, as well as Snapchat, TikTok and YouTube) for damaging the mental health of young people.
The coordinated action of these 41 states — 33 of which came together in a single, joint lawsuit — details in over 200 pages the reasons why Instagram (and, to a lesser extent, Facebook) is a harmful product for young people. In the lawsuit, the attorneys general explain that they are filing the case because Meta’s “violations present a continuing harm, and the unfair acts and practices complained of here affect the public interest.”
“Meta has profited from children’s pain by intentionally designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem,” said Letitia James, the attorney general for New York, one of the states involved in the federal suit, after the lawsuit was filed. “Social media companies, including Meta, have contributed to a national youth mental health crisis and they must be held accountable.”
Nearly a third of American teenage girls had suicidal thoughts in 2021, up 60% from the previous decade, according to the Centers for Disease Control and Prevention. “Just like Big Tobacco and vaping companies have done in years past, Meta chose to maximize its profits at the expense of public health, specifically harming the health of the youngest among us,” Colorado Attorney General Phil Weiser said in a statement.
A Meta spokesperson expressed regret that the attorneys general have chosen “this path” instead of “working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use.” The company also objects that the lawsuit filed on Tuesday focuses solely on Meta and does not include other social media sites.
The latest legal blow against Meta was not without warning. The origin of the lawsuit dates back to early 2021, when the company made public its plans to create Instagram Kids, a version of its popular social network aimed at children under 13 years of age. The announcement caused a stir in the U.S.: several civil associations publicly protested, and a group of 44 attorneys general sent an open letter to Meta CEO Mark Zuckerberg, asking him to reconsider the idea.
In September of that same year, whistleblower Frances Haugen gave further weight to those suspicions. The former Facebook employee leaked internal documents to The Wall Street Journal showing that company executives knew of the harmful effects that Instagram had on young people, particularly adolescent girls. Even though the company’s own reports said that Instagram caused eating disorders and had even driven some users to suicide, the social network’s senior officials did nothing to address the situation.
In November, the significance of these findings led three states to open a formal investigation into the potential negative impact of Instagram on young people. Haugen’s revelations also led dozens of parents to file their own lawsuits against Meta, alleging that the company had harmed the health and even the physical safety of their children. This process culminated in April of this year with a class action lawsuit that also included several educational institutions. “Our case doesn’t just look at the content of the platforms: we refer to the very design of social networks, which are designed to be addictive,” Joseph VanZandt, the lawyer coordinating the class action lawsuit, explained to EL PAÍS. A hearing is scheduled for Friday in the Northern District of California; it could play a decisive role in how the latest lawsuit plays out.
In 2022, amid the rise of TikTok — the fastest-growing social media platform among young people — a group of attorneys general from more than 40 states opened a separate investigation into its potentially harmful effects on young people. They have not yet presented their conclusions.
In recent years, countries such as the United Kingdom and states such as California and Utah have approved regulations that require social media companies to provide more privacy and security for children. In Utah’s case, minors’ use of social media is restricted at night. In London, Instagram and Pinterest were accused last year of having caused the death of a teenager who died by suicide after prolonged exposure to those platforms.
Change of strategy
The lawsuits filed by 41 states against Meta reflect a change in strategy compared with the class action suit filed by U.S. educational institutions and individuals. “Whereas the latter emphasized that social networks negatively affect the mental health of young people, the lawsuit filed this week invokes state trade and consumer laws, as well as federal laws protecting minors’ privacy and personal data,” explains Rodrigo Cetina, a law professor at the Barcelona School of Management and an expert in U.S. law.
Although the lawsuit points out that the products (social media apps) can manipulate people and are addictive by design, the legal argument put forward is that these practices violate commercial law. “The large fines imposed in the United States for privacy violations are based not so much on privacy as a fundamental right, but rather on violations of consumer protection law,” says Cetina.
The change of course is no accident. In May of this year — after the class action suit from educational institutions was filed — two key rulings exempted social media platforms from responsibility for the content shared on them. That prompted the attorneys general to shift the focus of the lawsuit from something intangible — the negative effect social networks can have on mental health — to something more tangible: the fact that the company deceived children and parents by assuring them that its platforms were harmless and that their data would be protected.
The new lawsuit also focuses on a single company: Meta. This is presumably because prosecutors have the most solid incriminating evidence or information against it, including the internal documents leaked by Haugen. Cetina suggests there may be another reason why Meta was singled out: “The redacted parts of the lawsuit, which are confidential, suggest that the plaintiffs have had the help of a protected witness who has provided information about the case that could compromise Meta.” In other words, the prosecutors may have found another whistleblower.
Problematic aspects
The lawsuit argues that Instagram and Facebook break the law and are dangerous for children for several reasons. These are the main ones:
Deception. The central argument is that “Meta has frequently been ‘breaking’ the mental health, well-being, and trust of its youngest users,” knowingly and intentionally violating laws on consumer protection and on the collection of minors’ data. Meta, the text of the lawsuit says, has misled the public about the substantial dangers of its social media apps and has chosen to ignore the harm they cause to the mental and physical health of young people.
Business model. Meta built Facebook and Instagram around a business model designed to maximize the time and attention that young people devote to the platforms. “Defendants intentionally designed their social media platforms to manipulate children into excessively and compulsively using them,” the lawsuit states.
Addictiveness. To fulfill that business model, according to the lawsuit, “disruptive alerts, infinite scroll, autoplay, features promoting ephemeral content, and Reels are unfairly utilized by Defendants to extract additional time and attention from children whose developing brains are not equipped to resist those manipulative tactics.”
It continues: “Defendants design, develop, and deploy disruptive audiovisual and vibrating notifications, alerts, and ephemeral content features in a way that exploit children’s psychological vulnerabilities and cultivate a sense of ‘fear of missing out’ to induce children to spend more time on Meta’s platforms than they would otherwise.”
Slot machine effect. “By algorithmically serving content to children according to variable reward schedules, Meta manipulates dopamine releases in its child users, inducing them to engage repeatedly with its products — much like a gambler at a slot machine,” the lawsuit argues.
Persistence. Despite its own investigations, independent expert analyses, and public data, Meta turns “a blind eye to these damaging effects and persist[s] in exploiting children’s psychological vulnerabilities. Defendants’ acts and omissions constitute knowing decisions causing unnecessary and unjustified harm to children for Defendants’ financial gain.”
Personal data. Another key element of the lawsuit concerns privacy regulation. According to the complaint, Meta has been collecting data on children under the age of 13 without obtaining the necessary parental consent, a direct violation of U.S. federal law. The complaint says that Meta refuses to limit the collection and use of children’s personal data even though the law prohibits it from doing so, and adds that the company has done nothing to obtain parental consent to collect and monetize the personal data of minors.
Expansion of the model. Finally, the company is accused of expanding the use of these illegal and harmful practices to other products and platforms, such as WhatsApp, Facebook Messenger and the metaverse.