Meta and Instagram must face Massachusetts’ lawsuit claiming that the social media companies engaged in deceptive business practices and created a public nuisance by designing features that are harmfully addictive to teens and kids.
In a unanimous ruling issued on April 10, the state Supreme Judicial Court refused to apply §230 of the Communications Decency Act of 1996 to immunize Meta and Instagram against claims that they designed the Instagram app “to induce compulsive use by children” and “deliberately misle[d] the public about the safety of the platform.” That section of the CDA protects online platforms from liability for content posted on them.
Meta and Instagram, which Meta acquired in 2012, have faced an onslaught of lawsuits blaming the addictive design of their apps for harming the mental health of teens and children.
In March, a New Mexico jury returned a $375 million verdict in a case based on claims that the social media companies were aware their apps were harming kids. The next day, a Los Angeles jury awarded $6 million in damages to a 20-year-old woman who claimed she became addicted to Meta's and Google's platforms.
State attorneys general have joined the fray, too. A coalition of 33 states is pursuing a case against Meta and Instagram in federal court alleging violations of state deceptive business practice and nuisance laws.
In the Massachusetts case, the Commonwealth identified several aspects of Instagram's design and development that it alleged amounted to deceptive business practices: sending excessive notifications to users; enabling endless scrolling and autoplay of reels and stories; limiting the time certain posts and stories can be viewed, creating a "fear of missing out" among users; and delivering "like" notifications and refreshed content on a variable, unpredictable schedule.
At the same time it was designing these addictive and harmful features into the app, the Commonwealth alleged, Meta was misrepresenting to the public that its product is safe and nonaddictive for teens and children, and deceptively claiming that Instagram prevents underage users from accessing the platform despite knowing that its age-gating is ineffective.
These actions, the Commonwealth claimed, are not only deceptive business practices but also create a public nuisance.
In its defense, Meta argued to the trial court and the Massachusetts high court that the case should be dismissed because the focus of the Commonwealth’s complaint “depend[s] on content posted by users of Meta’s services,” a claim barred by §230(c)(1) of the CDA.
“Section 230 gives providers of interactive computer services like Meta immunity from suit—not just immunity from liability—for traditional publishing activity,” Meta argued.
In particular, §230(c)(1) bars claims that would treat providers of "interactive computer services" as the publisher of "any information provided by another information content provider."
“Each of those elements is met here, where the Commonwealth’s claims either target so-called ‘design features’ that Meta uses to decide whether and how to organize, display, and disseminate—i.e., publish—third-party content, or seek to hold Meta liable for third-party content itself,” the company argued.
In its brief supporting Meta and Instagram, the Washington Legal Foundation explained that “[w]hile the government’s pleadings use lots of high-tech verbiage about ‘never-ending feeds,’ ‘haptics,’ and ‘algorithms,’ Massachusetts’ core dispute is about Meta’s distribution of third-party content. That’s publishing—and publishing is protected to the hilt by section 230’s internet-freedom immunity.”
The Foundation for Individual Rights and Expression warned against continuing an oft-repeated trend of encroachment on speech rights stemming from “claims that new media is ‘addicting’ youth, causing immeasurable harm.”
“[R]adio and film crime dramas were described as a ‘habit-forming practice very difficult to overcome’ leading to increased nervousness and fear, and the ‘chronic stimulation’ of comic books were thought to be ‘contributing factors to many children’s maladjustments.’ The Commonwealth of Massachusetts continues that trend here,” it said.
“Publication is not a public nuisance,” the WLF argued. “Massachusetts may not use its consumer protection and tort laws to ‘abridg[e] the freedom of speech, or of the press,’ even if young people really like Instagram’s endless feed and respond to the app’s incessant notifications.”
The Court rejected these arguments. Section 230 immunity, Justice Dalila Argaez Wendlandt wrote, “requires that ‘another information content provider’ supply the relevant information upon which liability is asserted.”
“[T]he claims do not seek to impose liability on Meta for information provided by third parties,” Wendlandt reasoned. “Instead, the claims allege harm stemming from Meta’s own conduct either by designing a social media platform that capitalizes on the developmental vulnerabilities of children or by affirmatively misleading consumers about the safety of the Instagram platform.”
Having concluded that Instagram’s design features are conduct not covered by the CDA, the high court affirmed the trial court’s order refusing to dismiss the complaint “as it pertains to § 230(c)(1).” Notably, the court did not address Meta’s First Amendment defense.
In response to its court losses, Meta began pulling ads by plaintiffs' firms that recruit would-be plaintiffs for lawsuits against social media companies.
“We will not allow trial lawyers to profit from our platforms while simultaneously claiming they are harmful,” the company declared.