Juries have found a crack in the legal shield that has protected social media companies for three decades. They are not ruling on harmful content, but on harmful design.
Casey Newton explained on Hard Fork that plaintiffs are successfully arguing that features like infinite scroll, autoplay, and push notifications constitute defective product design. A Los Angeles jury ordered Meta and YouTube to pay $6 million; a New Mexico jury hit Meta with a $375 million verdict. The legal theory treats platforms like Big Tobacco, with internal documents - such as those revealed by Frances Haugen - serving as evidence that the companies knew their products were addictive.
This product-liability side door circumvents Section 230 of the Communications Decency Act, which protects platforms from liability for user posts. Kevin Roose noted the difficulty in separating a platform's mechanical design from its editorial choices, but juries are currently ignoring that distinction in favor of public health claims.
The implications are systemic. If these verdicts survive appeal, the standard social media feed becomes a legal minefield. Newton predicts AI chatbots will be the next frontier for this liability debate, given their highly engaging and 'sticky' nature. A Pew study found 64% of teens already use them, with three in ten doing so daily.
Casey Newton, Hard Fork:
- This is not about being harmed by a particular piece of content.
- This is about the design of the whole platform.
Parallel to the legal reckoning over design is a deeper economic shift driven by AI. Tristan Harris, on Modern Wisdom, argues AI is creating an 'intelligence curse' akin to the resource curse in petrostates. When data centers, not human workers, drive GDP, governments lose incentive to invest in their citizens. Sam Altman has suggested data centers are cheaper to scale than raising and educating humans.
This vision of AI as a replacement economy, not just a tool, reframes the mission of the leading labs. Their goal of automating all cognitive labor - evidenced by reports that AI already writes 90% of the code at Anthropic - could render the post-war social contract obsolete. The wealth transfer wouldn't just be between people, but from people to a handful of companies controlling the infrastructure.
Tristan Harris, Modern Wisdom:
- What makes AI different is that you're designing it; you're not really coding it, like 'I want it to do this.'
- You're more like growing this digital brain that's trained on the entire internet.
The common thread is the exploitation of human psychology at scale. Social media found backdoors in the mind for engagement; AI threatens to scale that into total economic and social isolation. The legal system is now targeting the first wave of that exploitation, while the second, more profound wave accelerates unchecked.

