So addictive that it leads to trial: Social media takes the stand

A dozen trials in Los Angeles aim to determine whether platforms like Facebook are designed to be addictive for young people, and whether that is a crime


Several trials are marking a social and legal milestone in U.S. history. For the first time, a judge has put the major tech companies that own social media platforms on the stand to determine whether they create addiction among young people, whether they are as dangerous as the tobacco industry, and whether they require far more comprehensive regulation.

Families of children addicted to social media and youth organizations have decided to take legal action. As a result, Meta (owner of WhatsApp, Facebook, Instagram, and Threads), TikTok, Snap (parent company of Snapchat), and YouTube (owned by Google) will face several lawsuits throughout this year. The first of at least a dozen trials scheduled to take place in Los Angeles began on Tuesday. It is estimated that there are more than 2,500 cases, at both the state and federal levels, involving families, associations, and school districts concerned about children’s mental health. Nine more trials are expected this year based on individual complaints, and a tenth involves an entire school district, Oakland (near San Francisco), as the plaintiff. According to the plaintiffs, social media is a public problem that imposes a massive burden — social, educational, and economic — when it comes to addressing youth addiction. And they have decided to take the matter to court.

The young Californian K. G. M. (known only by her initials), now 20 years old, is the first of these plaintiffs and the one many are watching closely. Her case is that of a girl as ordinary as many others: she watched YouTube videos from the age of six and began uploading content at eight, joined Instagram at nine (when she got her first iPhone), TikTok (then Musical.ly) at 10, and Snapchat at 11. She has since struggled with depression, anxiety, and body dysmorphia. Her lawyers argue that beauty filters, the algorithm, videos that autoplay without the user pressing Play, and many other elements of these apps' refined design, created to keep users scrolling for hours and which, they claim, lead to bullying, sexual harassment, and even suicide, were decisive in creating a social media addiction that resulted in serious problems.

“I wish I never downloaded it,” K. G. M. once told her sister about Instagram, according to the lawsuit. The same sister recalled that if their mother tried to take K.’s phone away, she would have an emotional breakdown over not being able to access Instagram, “like someone had died.”

“There became a point where she was so addicted that I could not get the phone out of her hand,” her mother said. “I believe that social media, her addiction to social media has changed the way her brain works,” she added. “She has no long-term memory. She can’t live without a phone. She is willing to go to battle if you were even to touch her phone.”

The family is now demanding financial compensation — the amount has not been disclosed — as well as changes and a public apology from the creators and owners of the apps.

Last week, the family reached an “amicable” agreement with Snap, according to the company’s spokesperson. On Monday, just hours before jury selection for the trial was set to begin, they reached another agreement with TikTok; the figures are also unknown. But that will not halt the trial, in which the lawyers intend to follow a strategy similar to the major lawsuits against tobacco companies in the 1990s, arguing that the product is far more addictive and causes greater long-term harm than initially known.

The Seattle-based Social Media Victims Law Center represents dozens of these victims. One of its attorneys, Matthew Bergman, told several media outlets last week that the public will finally learn “what social media companies have done to prioritize their profits over the safety of our kids.” “We are not talking about third-party content. We are talking about the reckless design of these platforms that are designed not to show kids what they want to see, but what they can’t look away from,” said Bergman, who is representing K. G. M. The attorney added that this is a “historic point” because for “the first time,” thousands of families will have the right to bring their cases before a court.

If cigarettes and opioids were the touchstones of addiction in the past, now these dangers are within reach from an early age. And children and their families are beginning to become aware of the side effects. That is why, for the first time, in what is expected to be a series of historic trials, some of the world’s largest and most powerful companies must sit in court and explain themselves. They may be acquitted, but that does not mean they will come out unscathed.

What is starting now is just the tip of the iceberg: there are more than 1,600 plaintiffs in California, and over 200 at the federal level. The hearings are expected to last years. Simply bringing TikTok or Instagram to court in each case is a formidable undertaking. For the K.G.M. trial, Mark Zuckerberg, who created Facebook 22 years ago and is now CEO of its parent company Meta, is expected to testify.

Big tech companies are either silent or waiting. They are relying on the First Amendment, which protects freedom of speech, as a shield. Specifically, they are invoking, as they have for years, Section 230 of the Communications Decency Act, which they claim exempts them from liability for content posted by users. But in this case, the plaintiffs’ lawyers are not arguing about the content itself; they are questioning how the platforms’ very design shapes that content and, therefore, affects users.

Some companies have spoken out. YouTube issued a statement: “The allegations in these complaints are simply not true,” a spokesperson said. “Providing young people with a safer, healthier experience has always been core to our work,” referring to parental controls and filters.

Meta published a lengthy post on its corporate website, titled Beyond the Headlines: Meta’s Record of Protecting Teens and Supporting Parents. The company is under intense scrutiny: less than a year ago, the U.S. Senate accused it of having “blood on its hands” and producing “a product that kills people.” In this post, Meta addresses the complaints, claiming that the legal battle “oversimplifies a serious matter” and that the “claims don’t reflect reality.” The company says it has implemented suicide prevention tools and, for example, created Instagram accounts specifically for teens with stricter limits.

But the evidence shows that this isn’t exactly the case. In November, a federal judge ordered the release of more than 5,800 pages of internal Meta emails and conversations, revealing that the company knew its products were highly addictive but prioritized engagement over user safety and mental health, ignoring the risks to families. For instance, in 2019 Instagram removed some beauty filters that altered users’ faces to impossible ideals — but in that year and the next, several executives asked Mark Zuckerberg to restore them, despite knowing how they could affect young users’ body image, especially girls. One executive even mentioned that one of his daughters had struggled with body dysmorphia. Yet the filters returned.

In fact, just this Monday, in another lawsuit filed in New Mexico, the state attorney general said that Meta and Zuckerberg “declined to impose reasonable guardrails” for minors. Attorney General Raúl Torrez alleges that Meta “failed to stem the tide of damaging sexual material and sexual propositions delivered to children” on both Facebook and Instagram. Some Meta executives objected to chatbots launched in early 2024 that presented romantic and sexual scenarios to children and teens, even calling them “indefensible.” Nick Clegg, the former U.K. deputy prime minister who served as Meta’s head of global policy until last year, expressed clear concern in an email about these sexual interactions: “Is that really what we want these products to be known for?”

In these lawsuits, Meta and other companies argue that mental health is not shaped by a single factor, such as social media, and that it is not scientifically proven that these platforms create addiction. However, former U.S. Surgeon General Vivek Murthy warned months ago that “teens who use social media for more than three hours a day face double the risk of depression and anxiety symptoms.” He even called for warning labels on social media similar to those on tobacco products.

A Pew Research Center study found that one in five teens believes social media harms their mental health, and half acknowledge that it is harmful for people their age, affecting issues like productivity and sleep. Some states, like California, which has banned cell phones in public classrooms, and countries like Australia, which has set 16 as the minimum age to join social media, are trying to put limits on what, at least for now, remains largely unchecked.
