Meta ignored child sex abuse in VR, say whistleblowers

Two former employees at Meta testified against the company at a Senate hearing this week, accusing it of downplaying the dangers of child abuse in its virtual reality (VR) environment.

The whistleblowers say they saw incidents where children were asked for sex acts and nude photos in Meta’s VR world, which the company calls the ‘metaverse’. This is a completely immersive world that people enter by wearing a Meta virtual reality headset. There, they can use a variety of apps that surround them in 360-degree visuals. They can interact with the environment, and with other users.

At the hearing, held by the US Senate Judiciary Subcommittee on Privacy, Technology and the Law, the two former employees warned that Meta deliberately turned a blind eye to potential child harms. It restricted the information that researchers could collect about child safety and even altered research designs so that it could preserve plausible deniability, they said, adding that it also made researchers delete data that showed harm was being done to kids in VR.

“We researchers were directed how to write reports to limit risk to Meta,” said Jason Sattizahan, who researched integrity in Meta’s VR initiative during his six-year stint at the company. “Internal work groups were locked down, making it nearly impossible to share data and coordinate between teams to keep users safe. Mark Zuckerberg disparaged whistleblowers, claiming past disclosures were ‘used to construct a false narrative’”.

“When our research uncovered that underage children using Meta VR in Germany were subject to demands for sex acts, nude photos and other acts that no child should ever be exposed to, Meta demanded that we erase any evidence of such dangers that we saw,” continued Sattizahan. The company, which completely controlled his research, demanded that he change his methods to avoid collecting data on emotional and psychological harm, he said.

“Meta is aware that its VR platform is full of underage children,” said Cayce Savage, who led research on youth safety and virtual reality at Meta between 2019 and 2023. She added that acknowledging this problem would force the company to remove those children from the platform, which would hurt its engagement numbers. “Meta purposely turns a blind eye to this knowledge, despite it being obvious to anyone using their products.”

The dangers to children in VR are especially severe, Savage added, because interacting with the VR environment requires real-life physical movements made with the headsets and their controllers.

“Meta is aware that children are being harmed in VR. I quickly became aware that it is not uncommon for children in VR to experience bullying, sexual assault, to be solicited for nude photographs and sexual acts by pedophiles, and to be regularly exposed to mature content like gambling and violence, and to participate in adult experiences like strip clubs and watching pornography with strangers,” she said, adding that she had seen these things happening herself. “I wish I could tell you the percentage of children in VR experiencing these harms, but Meta would not allow me to conduct this research.”

In one case, abusers coordinated to set up a virtual strip club in the app Roblox, paying underage users in the in-game currency, ‘Robux’, to have their avatars strip. Savage said she told Meta not to allow the app on its VR platform. “You can now download it in their app store,” she added.

This isn’t the first time that Meta has been accused of ignoring harm to children. In November 2023, a former employee warned that the company had ignored sexual dangers for children on Instagram, testifying that his own child had received unsolicited explicit pictures. In 2021, former employee Frances Haugen accused the company of downplaying risks to young users.

Meta has reportedly referred to the “claims at the heart” of the hearing as “nonsense”.

Senator Marsha Blackburn, who chaired the hearing, has proposed the Kids Online Safety Act, which would force platforms to make responsible design choices that prevent harm to children.

