The Massachusetts Supreme Judicial Court delivered a significant procedural victory to state prosecutors on Friday, ruling that Attorney General Andrea Joy Campbell's lawsuit against Meta Platforms can proceed. The unanimous decision allows Campbell's office to pursue claims that Meta's platform design—not user-generated content—harms children and teenagers through addictive features.
Justice Dalila Argaez Wendlandt, writing for the court, determined that Meta failed to demonstrate it was "entitled to the protection" of Section 230 of the Communications Decency Act at this stage of the litigation. The federal statute has historically shielded technology companies from liability for third-party content posted on their platforms. The court's reasoning hinged on Campbell's strategic focus: her complaint targets Meta's alleged conduct in creating and operating Instagram's architecture, which she claims is deliberately engineered to foster addiction among young users.
A Strategic Legal Distinction
"This case targets Meta's alleged conduct in designing a social media platform, rather than the content posted by third-party users," the court noted, drawing a legal line that could have far-reaching implications for the tech industry. Campbell's lawsuit alleges that Meta misled consumers about Instagram's safety while deploying design elements that exploit psychological vulnerabilities in young people.
The Massachusetts action joins dozens of similar cases filed nationwide against social media and artificial intelligence companies. It comes on the heels of two major jury verdicts that found Meta and Google's YouTube liable for their platforms' impact on youth. In California, a jury ordered the companies to pay a combined $6 million for negligent platform design; a day earlier, a New Mexico jury had returned a $375 million verdict against Meta for compromising children's safety online and violating state consumer protection laws.
Industry Pushback and Political Context
Meta immediately challenged the court's reasoning. A company spokesperson told reporters, "We continue to disagree with the false distinction between content and platform design. This ruling is procedural and doesn't address the merits of the case." The spokesperson emphasized Meta's "longstanding commitment to supporting young people" through safety tools and collaborations with experts and law enforcement.
Campbell, a Democrat, celebrated the decision as a "victory" on the social media platform Bluesky. "Meta tried to get our lawsuit against them for fueling the youth mental health crisis thrown out, but unfortunately for them, the courts just ruled that the case will continue," she wrote. "As your AG, and as a mom, I'll always fight to demand accountability from big tech and safer spaces online."
Legal observers predict these verdicts could pave the way for more aggressive state legislation targeting platform design, creating a new regulatory frontier beyond content moderation. The cases test whether traditional product liability frameworks can be applied to digital environments, potentially circumventing the broad protections companies have enjoyed under Section 230.
The ruling comes amid heightened judicial scrutiny of technology companies and their legal defenses. It follows other significant court actions challenging established norms, including a Supreme Court decision that jeopardizes numerous state professional licensing laws and a landmark 8-1 ruling striking down Colorado therapy restrictions on free speech grounds. Together, these decisions signal a judiciary increasingly willing to reexamine long-standing legal doctrines across multiple sectors.
As the case moves forward in Massachusetts, it will be closely watched by attorneys general in other states considering similar actions. The outcome could determine whether platform design becomes the next major battleground in regulating technology companies, shifting the legal focus from what users post to how companies architect engagement.
