Jennifer Hanley, Meta’s head of safety policy for North America, is pushing back against critics who accuse the company of half-measures in protecting teens online. In a new statement, she argues that collaboration—both across the tech industry and with lawmakers—is essential to tackling online exploitation.

Hanley, a parent herself, acknowledges the anxiety many families feel about internet safety. She recalls the moment her own child first video-called grandparents, a reminder of both the promise and peril of digital life. “Striking a balance between safety and opportunity will help my kids grow and succeed,” she writes.

Meta has rolled out Teen Accounts, which automatically place users under 18 into settings that limit who can contact them and what content they see. Teens under 16 need parental permission to loosen those restrictions. The accounts block unknown adults from initiating private chats and use specialized technology to flag suspicious accounts. For parents who want more control, supervision tools allow setting daily time limits as low as 15 minutes, viewing recent message history, and receiving alerts if a teen repeatedly searches for terms related to self-harm.

Hanley dismisses claims that Meta relies on “delays, excuses, and half measures,” insisting that millions of teens benefit from these protections daily. She points to partnerships with nonprofits like Childhelp, which developed a curriculum on recognizing grooming and sextortion scams. That program reached more than 1.5 million middle schoolers in its first year, according to Childhelp.

Despite these efforts, Hanley acknowledges that predators adapt. Meta’s investigators work with law enforcement to identify and disrupt criminal networks, and the company reports more instances to the National Center for Missing and Exploited Children than any other tech firm. But she stresses that predators operate across platforms, making industry-wide coordination critical.

That’s where the Tech Coalition’s Lantern program comes in. Meta co-created the initiative, which allows participating companies to share “signals” about potentially predatory accounts. The latest transparency report shows that tech companies have shared more than 2 million signals since Lantern launched three years ago, including nearly 1 million in 2025 alone.

Hanley also calls for federal legislation that would let parents approve their teens’ app downloads directly from app stores, as well as industry-wide age-appropriate content standards. “These are issues that no one person—or company—can address alone,” she writes.

The push for broader safeguards comes as Meta faces scrutiny over its handling of teen safety and as the company continues to invest in AI and other technologies. The debate over online child protection remains a flashpoint in Washington, with lawmakers weighing new regulations that could reshape how platforms operate.

Hanley joined Meta seven years ago, drawn by its safety commitments. She says she stays because she sees the dedication of colleagues who are “supporting young people” through tools and partnerships. “By collaborating with others and building platforms that help kids learn, grow and explore,” she concludes, “each of us can play a role in keeping teens safe.”