March 30
Read chapter 2 of Baase on privacy.
Meta and YouTube lost a state-court case last week in which plaintiff KGM alleged that design features of the platforms led to her addiction. Earlier, New Mexico won a case alleging that Meta had misled users about how safe the platform was from sexual predators.
Both cases skirted Section 230 by arguing that the platform's design was at fault.
Both cases will be appealed.
The cases seem to focus on features like "selection algorithms", "autoplay", and the infinite scroll behind "doomscrolling". While collectively these do create difficulties for some people, demonizing the features themselves is also problematic.
Most selection algorithms are variations on "other people who have liked what you have liked have also liked this". In the Supreme Court's 2024 decision Moody v NetChoice, Justice Kagan wrote that "expressive activity includes presenting a curated compilation of speech originally created by others". That's algorithms.
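A minimal sketch of that "people who liked what you liked also liked this" recipe, using made-up users and items and simple co-occurrence counting (real recommendation systems are far more elaborate):

```python
# Toy item-to-item collaborative filtering: recommend items that co-occur
# with a user's likes in other users' like-sets. All names are invented.
from collections import defaultdict
from itertools import combinations

likes = {
    "alice": {"cats", "cooking", "chess"},
    "bob":   {"cats", "cooking", "cars"},
    "carol": {"cooking", "chess"},
}

# Count how often each ordered pair of items is liked by the same user.
co_likes = defaultdict(int)
for items in likes.values():
    for a, b in combinations(sorted(items), 2):
        co_likes[(a, b)] += 1
        co_likes[(b, a)] += 1

def recommend(user):
    """Suggest unseen items that co-occur with the user's likes, best first."""
    seen = likes[user]
    scores = defaultdict(int)
    for item in seen:
        for (a, b), n in co_likes.items():
            if a == item and b not in seen:
                scores[b] += n
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("carol"))  # → ['cats', 'cars']
```

The point of the sketch is that the "algorithm" is just curation arithmetic over other people's expressed preferences, which is why Kagan's framing fits it so naturally.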
Compare this to a novel that detailed how a major character caused self-harm. Should an author or publisher be liable for that?
Doomscrolling may be a problem with some content, but then it is the content that is the problem. Google search has offered effectively endless results (page by page) since 2002; nobody has complained.
Here's an argument from www.techdirt.com/2026/03/26/everyone-cheering-the-social-media-addiction-verdicts-against-meta-should-understand-what-theyre-actually-cheering-for/:
Here’s a thought experiment: imagine Instagram, but every single post is a video of paint drying. Same infinite scroll. Same autoplay. Same algorithmic recommendations. Same notification systems. Is anyone addicted? Is anyone harmed? Is anyone suing?
The New Mexico case focused heavily on the fact that Facebook Messenger supports end-to-end encryption, which led to messages that the police could not monitor. If that decision stands, end-to-end encryption is toast. That would have some benefits for law enforcement. But does that really make the world better?
The jurors in the KGM case were asked if the social-media companies' behavior was a substantial factor in the harm. There were clearly a host of other factors.
There are lots of claims that social media is just plain harmful to those under 18. But the evidence is not strong; this is not like cigarette smoking. See www.techdirt.com/2023/12/18/yet-another-massive-study-says-theres-no-evidence-that-social-media-is-inherently-harmful-to-teens.
Facebook and Google can absorb the liability and carry on, even if they lose their appeals. But new social-media startups cannot.
Cox Communications v Sony
The Supreme Court threw out a copyright-infringement case charging ISP Cox with "contributory infringement" for failing to suspend the service of many customers who had been visiting file-sharing sites. In its unanimous decision, the Court seemed reluctant to put much stock in "indirect liability", at least for copyright.
Debates
Source Code as Speech