THURSDAY, MARCH 5, 2026
Consumer Tech3 min read

Meta downplays its own safety research in NM trial

By Riley Hart

[Image: Child interacting with educational robot toy — photo by Andy Kelly on Unsplash]

Meta’s own safety research got a shrug from its founder on the witness stand during a New Mexico child-safety trial, raising fresh questions about what big platforms know and how they talk about it in court.

Jurors in the case heard pre-recorded testimony from Mark Zuckerberg, who faced questions about internal studies on social-media addiction and other harms that researchers at Meta had tracked for years. The deposition, recorded last March, highlighted a tension between what the company’s researchers saw and how the CEO described those findings in a public-facing, business-focused context. In one early exchange, attorneys surfaced a document about how feedback on Facebook posts can condition users to return to the platform, with a chart suggesting that posting feedback “leads contributors to seek rewards by visiting the site more often.” Zuckerberg responded that he wasn’t “sure if that’s actually how it works in practice,” but he agreed that the questioner’s summary captured what the document appeared to say.

The testimony also drew attention to a chart showing that roughly 20 percent of 11- and 12-year-olds were monthly Instagram users at the time, with Zuckerberg acknowledging uncertainty about the methodology used to estimate those numbers. Jurors saw a portrait of a company weighing complex, long-running research against rapid product priorities and the public-facing claim that its platforms are designed with safety in mind. The trial’s context is critical: it centers on child-safety concerns and the broader question of whether tech giants disclose meaningful risk information to users, regulators, and the courts.

Through a consumer-technology lens, the episode underscores a recurring dilemma: internal research that maps subtle behavioral effects versus the risk-communication standards that underpin consumer trust. Tech platforms routinely commission long-term studies to understand engagement, but the same institutions often face pressure to present safety features in a light that supports continued use and monetization. In this case, the deposition illustrates how the same data can be interpreted differently depending on who is describing it and for what purpose.

For everyday readers, the moment matters beyond courtrooms. If the trial confirms that internal research showed harmful patterns while public messaging downplayed them, regulators and lawmakers may demand more independent validation. In parallel, consumer advocates are watching for clearer disclosures about how platforms influence behavior, especially among underage users and vulnerable groups.

Four takeaways emerge for the wider tech ecosystem. First, the gap between internal research and external messaging is a central risk factor for trust and legal exposure; companies should consider third-party verification of sensitive studies by independent auditors. Second, age-related usage data are notoriously fragile: methodology, sampling, and privacy constraints can all distort what “20% of 11- to 12-year-olds” means in practice, affecting both policy discussions and parental decision-making. Third, the trial spotlights the ongoing tension between product optimization and safety safeguards; if regulators push for stricter rules, product teams will need clearer guardrails that don’t unduly throttle innovation. Fourth, the scene foreshadows what to watch next: how Meta and other platforms frame internal findings in public testimony and regulatory filings as lawmakers push for stronger protections around youth online experiences.

There is no price tag to break out here: the episode centers on policy, safety research, and deposition testimony rather than a product you buy or subscribe to. Meta’s consumer apps are free to use, with revenue coming primarily from advertising rather than direct user subscriptions. What matters for readers is how the company interprets its internal findings in public statements, and what that implies for future safety commitments and oversight.

Sources

  • Mark Zuckerberg downplays Meta's own research in New Mexico child safety trial
