State attorneys general are aggressively targeting social media and artificial intelligence companies over inadequate protections for minors, filling a void left by a paralyzed Congress that has failed to pass comprehensive online safety legislation despite bipartisan concern.

From new investigations to landmark settlements and jury verdicts, state leaders are stepping up enforcement. Alabama Attorney General Steve Marshall (R) announced a $12.2 million settlement with Roblox this month, saying, “Alabama stepped in where others failed to act.” The gaming platform, used by nearly half of U.S. minors under 16, faced allegations of failing to shield children from harm. Similar agreements in Nevada and West Virginia require age verification for all Roblox users—a contentious issue that has stalled federal action due to privacy concerns.

“What makes AGs a unique creature is that we do have the ability to change business practices through injunctive relief,” Marshall told The Hill, calling state prosecutors an “underdiscussed player” in areas where Congress won’t act.

The state-level push extends beyond Roblox. Meta, TikTok, Discord, Snap, and Reddit are under investigation in at least seven states over their safety features. AI companies are also in the crosshairs: Florida Attorney General James Uthmeier (R) launched a criminal probe into OpenAI after a suspect allegedly used ChatGPT to plan a double murder. “If ChatGPT were a person, it would be facing charges for murder,” Uthmeier posted, adding that AI is being used to create child sexual abuse material and to encourage self-harm.

Courtrooms have become a battleground. In recent weeks, juries in New Mexico and California found Meta and Google liable for platform designs that harm kids—the first such verdicts, bypassing the long-standing Section 230 liability shield. “The dam is breaking,” said Sen. Dick Durbin (D-Ill.), who has pushed to repeal Section 230. New Mexico Attorney General Raúl Torrez (D), who led the Meta case, compared the moment to 1990s Big Tobacco settlements, telling a Capitol crowd, “It was in the courtrooms where individual juries had their say, and this is our moment.”

Sen. Josh Hawley (R-Mo.) urged Congress to “take a leap from their playbook” and treat the verdicts as a wake-up call. Yet federal efforts remain mired in intraparty disputes, procedural hurdles, and clashes between chambers. A major kids’ online safety bill has stalled despite broad bipartisan support and advocacy pressure.

State officials say they cannot afford to wait. “Parents are not asking for permission any longer,” said John Cusey of the Institute for Families and Technology. “When you talk to them on the state level, they don’t care if it’s Republican or Democrat.”

The state actions reflect broader unease with the industry: critics warn that social media's grip extends to lawmakers themselves, and Meta CEO Mark Zuckerberg has repeatedly faced criticism for prioritizing growth over child safety, a pattern the state lawsuits aim to disrupt.

As the federal impasse persists, state prosecutors are reshaping the regulatory landscape—one investigation, settlement, and verdict at a time.