
Opinion
Social media is the curse of our time
by Samuel Buchmann

Instagram and YouTube are on trial in the USA, accused of designing products that endanger children. The proceedings could have a massive impact.
A groundbreaking lawsuit against Meta and Google is underway in Los Angeles. For the first time, a jury will decide whether major social media platforms have deliberately designed their content to make young people addicted – with negative consequences for their mental health. The trial is being viewed as a landmark case for thousands of other lawsuits against the industry.
Part one of the proceedings focuses primarily on Instagram. Its boss, Adam Mosseri, was called as a witness, and Meta CEO Mark Zuckerberg also had to appear. Below, you’ll find answers to the seven biggest questions surrounding the trial.
The specific case centres on a 20-year-old woman identified by the initials KGM. She was already using YouTube at the age of six, set up an Instagram account at nine and was later also active on Snapchat and TikTok. KGM accuses the platforms of making her addicted, and argues that the companies behind them are therefore partly responsible for her depression, anxiety disorders, body image disorders and suicidal thoughts.
Meta (Instagram) and Alphabet (YouTube) are under the spotlight. Snap and TikTok settled out of court before the trial began. Nevertheless, they’re indirectly affected, since the ruling will set a precedent for countless other lawsuits.

The plaintiff’s lawyers aren’t concerned with individual content, but with the product design of the platforms themselves. They compare it to cigarettes, with a similar potential for addiction. The comparison is a legal strategy: tobacco companies were brought to court in the 1990s because they’d deliberately played down health risks. The central accusation in this first trial against Instagram and YouTube rests on product liability: social media apps are defective products because they’re addictive and don’t adequately protect young users.
The companies are defending themselves against the accusation that they deliberately created addictive products. Among other things, they invoke Section 230 of the Communications Decency Act. This law largely releases platforms from liability for third-party content, granting them freedoms that traditional media such as newspapers are denied. The companies argue that any harmful effects are caused by user content, not by platform design.
The trial is just one in a series of so-called bellwether trials. In the USA, many similar individual lawsuits have been consolidated into coordinated proceedings, an alternative to conducting thousands of identical trials. From these, a few representative cases have been selected to be heard before a jury. The results will set precedents for the remaining lawsuits. Nine such bellwether cases are scheduled in Los Angeles. In addition, separate lawsuits are being brought by school districts and states.
These test cases are intended to show how courts react to the evidence, which arguments carry weight and what damages are realistic. The judgments aren’t formally binding for other lawsuits; they only serve as a guide. If the jury decides in favour of the plaintiffs, the chances of high settlements in the remaining cases increase. Conversely, defeats lessen the chances of success in all the other lawsuits.
It’ll be a tall order. Firstly, the plaintiffs have to prove that the platform design is defective and promotes addiction. Secondly, they must show that it was a significant factor in the plaintiff’s psychological harm in this individual case. The latter isn’t easy: in contrast to tobacco products, for example, scientific evidence of causation is scarce.

On the other hand, the plaintiffs say they have secured hundreds of thousands of pages of internal documents intended to prove that the companies were aware of the risks. In pretrial proceedings, the judge rejected a blanket invocation of Section 230, stating that the freedom from liability doesn’t extend to feature design. The tech companies are by no means certain to win. Rather, the proceedings are a test of whether US courts will treat social media platforms as strictly as they once treated tobacco companies.
A clear victory for the plaintiff would have several consequences. Financially, the platforms would face high compensation and settlement payments; after all, thousands of similar lawsuits are pending. Even a partial victory would have a signalling effect and could lead to many out-of-court settlements.
The courts could also force the companies to restrict or deactivate certain features for minors, such as beauty filters, aggressive push notifications and gamification mechanics. On a political level, pressure to regulate the platforms more strictly would also increase. Both could lead to less usage time and therefore less advertising revenue. The share prices of all the accused companies would likely fall in the event of a guilty verdict.
In the ongoing proceedings, further witnesses will be called, including managers at Meta and Alphabet, psychiatrists, media psychologists and experts on platform features. In the end, the jury will have to decide whether the design of Instagram and YouTube contributed significantly to the plaintiff’s suffering. That should take around two months. Regardless of the outcome, further test trials will follow in the coming months. Only a view across several judgments will show whether a clear tendency emerges in favour of the plaintiffs or the corporations.
The losing parties can appeal the verdicts, and both sides are certain to do so. The Superior Court in Los Angeles is the first instance, followed by the California Court of Appeal. There, three judges decide instead of a jury. They check whether the first ruling applied the law correctly, in this case regarding product liability. Only the Court of Appeal can address the questions of whether Section 230 also covers design and algorithms, and whether a conviction would infringe freedom of speech under the First Amendment.

The losing side can then try to appeal to the Supreme Court of California. However, that court doesn’t have to accept the case; it only deals with fundamental legal issues, such as the scope of Section 230. The final instance is the US Supreme Court, which likewise accepts only a fraction of cases, though given the national significance of the proceedings, this can’t be ruled out. As a result, it could take several years before a final verdict is reached; at least two years is realistic.