Trial lawyers are poised to accomplish in courtrooms nationwide what politicians have thus far failed to write into statute. The effects of this effort — undertaken without the deliberation of the nation’s representative bodies — are likely to rival those of even the most sweeping laws.
A jury in Los Angeles is determining whether Meta and YouTube are liable for design features alleged to have substantially aggravated a young woman’s psychological disorders.
With thousands of similar lawsuits ongoing, and more likely to follow, the Los Angeles jury's determination will echo loudly in the deliberations of other juries across America.
These echoes will prove dissonant with Americans’ love for, and dedication to, free speech. Meta’s Instagram and YouTube were said to have disseminated speech too well, working too successfully to configure their products to maintain users’ interest.
This is supposed to constitute "addicting" their users. In fact, it is the aim of every business, from media organizations to retail stores to restaurants, to attract and retain customers and to earn profits by marketing a product that consumers value.
In short, it is the business of entrepreneurs to give the people what they want. But the product of social media platforms is not loaves of bread or pianos or widgets; it is speech, protected by the First Amendment.
Meta and YouTube are charged with having designed their products to include features — such as “infinite scroll” and individualized algorithmic recommendations — which allow and incentivize their users to view too much speech for too long.
As National Review's Andrew McCarthy put it, "the plaintiff's lawyers argued … a theory that the case was not about the content but about the processes by which the platforms present the content." Despite titanic efforts to harden this distinction, it melts under the heat of elementary scrutiny. Platforms' design features are impotent absent content that intrigues users.
Mike Masnick, editor of Techdirt, put it this way:
Here’s a thought experiment: Imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?
Social media algorithms sort and distribute speech — a function without which individuals could neither access speech online nor effectively find an audience for their own speech.
Whatever the plaintiff’s attorneys contend, the liability imposed upon Meta and YouTube cannot be severed from the content they host and disseminate. Without the latter, the former would never be imagined, much less found by a jury.
The plaintiff in the case, a young woman known as Kaley or “KGM,” was brought up in anguishing conditions, the daughter of a mother who physically and emotionally abused her. She “was self-harming around when she was in the 6th grade,” reads the Associated Press account of the trial.
It is unsurprising that she, as a young girl, withdrew to social media to find something like peace, fulfillment, and satisfaction. It is equally unsurprising that she used social media to excess and leveraged her every chance to obtain engagement.
More generally, it is anything but certain that users' affinity for social media is rightly termed an "addiction." Likewise, research purporting to prove that social media has caused an epidemic of psychological disorders among children — the research of Jonathan Haidt, for example — has proven unreliable, rife with faulty methodology and confirmation bias.
It is obvious that some people misuse social media and that their lives are consequently diminished. But this no more indicates that the platforms are "defective" in some legally cognizable sense than the mere existence of obesity in America indicates that McDonald's or Taco Bell's offerings are "defective" — or that fast-food restaurants ought to be held liable for occurrences of diabetes.
Humans are a diverse bunch. That a minority, suffering from particular difficulties or vulnerabilities, cannot engage with this product or that in a healthy fashion should not, in a courtroom or the public square, constitute the basis of a totalizing rebuke.
Should the Los Angeles verdict stand, social media companies, confronted with the prospect of liability, are bound to remake their products to prevent any allegation — credible or otherwise — that their platforms cause or worsen whatever psychological distress users might suffer.
“If media companies must worry about liability whenever their expressive outputs are thought to be ‘harmful,’ the universe of available content would be reduced to the safest, blandest, and least engaging stuff imaginable,” warns Ari Cohn, the lead counsel for tech policy at the Foundation for Individual Rights and Expression.
The operations of Instagram and YouTube broke no law enacted by Congress or a state legislature to regulate the workings of social media. Even so, this litigation, if successful, will be regulatory in its effect, resulting in the contracting of the free and open internet.