Roblox sued by Southern California families alleging children met predators on its platform

Video game platform Roblox is facing new lawsuits from parents who claim the San Mateo, California-based company is not doing enough to protect children from sexual predators.

A Los Angeles County mother, who is not identified in the November lawsuit, claims her daughter met a predator on Roblox who convinced the girl to send sexually explicit photos of herself through the social media platform Discord. The woman is suing both Roblox and the San Francisco-based company Discord.

According to the lawsuit, filed in Los Angeles County Superior Court, the woman believed Roblox was safe when her daughter signed up for the gaming platform last year at age 12, because it was marketed as child-friendly and educational.

Her daughter then became friends with a Roblox user known as “Precious,” who claimed to be 15 and said she was abused at home and had no friends, the lawsuit says. When the daughter, accompanied by a friend's parents, met the user at a beach, “Precious” turned out to be an older man who tried to introduce the girl to a group of older men.

After the meeting, the predator tried to persuade the girl to visit his apartment in Fullerton and to alienate her from her family. According to the lawsuit, the child suffered psychological trauma, depression and other emotional distress as a result of her experiences on Roblox and Discord.

The lawsuit accuses Roblox and Discord of putting profit before safety, creating a “digital” and “real-life nightmare” for children. It also alleges that the companies' failures are systemic and that other children have been harmed by encounters with predators on the platforms.

“Her innocence was taken from her and her life will never be the same,” the lawsuit states.

In a statement, Roblox said it is “deeply concerned by any incident that puts any user at risk” and prioritizes online safety.

“We also understand that no system is perfect, which is why we are constantly working to further improve our security tools and platform restrictions so parents can trust us to keep their children safe online, launching 145 new initiatives this year alone,” the statement said.

Discord said it is committed to safety and requires users to be at least 13 years old to use its platform.

“We maintain robust systems to prevent the spread of sexual exploitation and grooming on our platform, and work with other technology companies and safety organizations to improve safety across the internet,” the company said in a statement.

The lawsuit adds to growing legal scrutiny of Roblox, a platform popular among young people that more than 151 million people use daily. Earlier this year, the company faced a wave of lawsuits from plaintiffs in several states who allege that predators pose as children on the platform and sexually exploit young users.

NBC4 News, which previously reported on the lawsuit, also reported that Roblox is facing another lawsuit from a family in Riverside, California, who claim their child was sexually abused by a man the child met on Roblox. That man was sentenced to 15 years in prison.

This year, Roblox has taken new steps to address growing child safety concerns. In November, the company said it would require users to verify their age in order to chat with other players; users can submit an ID or take a video selfie. The verification system estimates a person's age, allowing the company to limit communication between children and adults.

The Los Angeles County woman's lawsuit calls the safety changes Roblox made in 2024 “woefully inadequate” and says they came “too late.”

“All of these changes could have been implemented many years ago,” the lawsuit states. “None of them involve any new or revolutionary technology. Roblox moved forward only when its stock came under threat.”
