Meta, TikTok and Others to Testify in Senate Child Safety Hearing: Live Updates

[Photo: Large leather chairs behind a bench in a mostly empty hearing room at the U.S. Capitol.]

Five of the most prominent chief executives in tech will face questions on Wednesday from a powerful Senate committee about an issue that has drawn rare bipartisan scrutiny: the dangers that children encounter online.

Members of the Senate Judiciary Committee will grill the leaders of Meta, TikTok, Snap, Discord and X on topics including the online spread of child sexual abuse material and efforts to police it. They’ll also examine the social media companies’ broader impact on children’s safety and mental health as calls increase for platforms to be held responsible for protecting young people.

  • A backlash against the tech platforms has mounted after accusations that the companies knew they hosted underage users and that their products could be harmful. Several U.S. states have passed laws requiring social media services to verify their users’ ages or take other steps to protect young people, although they are facing legal challenges. Online safety laws have been approved in the European Union and in Britain.

  • Unusually, three of the executives — Evan Spiegel of Snap; Jason Citron of Discord; and Linda Yaccarino of X, the social network formerly called Twitter — had to be subpoenaed to testify. (Mark Zuckerberg, the chief executive of Meta, which owns Facebook and Instagram, and the TikTok chief executive Shou Zi Chew will also testify.)

  • Advocacy groups have for years pushed the government and the companies to do more to take down child sexual abuse material. The recent expansion of artificial intelligence products has fueled fears that the technology will create a surge in abusive images. In 2022, the National Center for Missing and Exploited Children received more than 32 million reports of such material from internet platforms and other sources.

  • Over the past year, the Senate Judiciary Committee has approved several bills, including one penalizing tech companies that do not remove content at the request of victims. Another would strip social media platforms of a legal shield that has protected them from lawsuits over content posted by their users. None of the proposals has become law.

  • While the hearing is focused on the exploitation of children online, lawmakers may also ask about the platforms’ broader effects on young people. One proposal would place a legal duty on the companies to prevent harms to minors.
