Content note: This story discusses cases of children being solicited for sexual activity and receiving sexually explicit messages, as well as self-harm.
How responsible are Roblox developer Roblox Corporation and the communication platform Discord for the illegal behavior of users on their platforms?
It's a question both companies face in a growing series of lawsuits, many filed by law firm Anapol Weiss. The firm represents a number of families whose children were victimized by predators on Roblox. Some of those predators encouraged the minors to communicate with them through Discord in order to sexually exploit them, first online and then in person.
The lawsuits follow years of reporting on how Roblox's allegedly weak moderation protocols enabled the exploitation of children through a combination of lax age-verification measures and sexually explicit user-created games hosted on the platform. Roblox Corp. and Discord have both rolled out a series of safety improvements over the last year (with Roblox's new age verification measures launching only this month), but according to some plaintiffs, the companies should have done more to protect users years ago.
Last week, when asked about the topic, Roblox Corp. CEO David Baszucki grew combative with New York Times reporters as they pressed him with repeated questions about the company's safety record.
Both companies have repeatedly denied any negligent practices, and they're headed to court with case law that appears to lean in their favor, thanks to a federal law known as the Communications Decency Act. But with the safety of so many young players at stake, it's worth asking: how does that law apply to these companies?
Section 230 generally protects companies that host user-generated content
First enacted in 1934, the law was updated in 1996 to include a clause known as “Section 230,” which provides limited federal immunity to “providers and users of interactive computer services.” It has shielded telecommunications companies and social media platforms from legal liability for content posted by their users. For example, if someone on Facebook falsely accuses you of a crime, you can sue that user for defamation, but not Facebook owner Meta.
The law also gives these companies civil immunity for removing obscene content or content that violates their terms of service from their platforms, even constitutionally protected speech, as long as the removal is done in “good faith.” It does not provide immunity from criminal offenses, state civil laws, or certain other claims, which may mean it does not apply to claims filed by the states of Florida, Louisiana, and Texas.
Cases like Jane Doe v. America Online Inc. and M.A. v. Village Voice set a precedent for the lawsuits against Roblox Corp. and Discord. In both cases, the defendants were accused of aiding and abetting the sexual abuse of minors, but federal courts ruled that the companies had civil immunity under Section 230.
Lawyers for plaintiffs suing Roblox and Discord say it's not about hosted content
Alexandra Walsh, an Anapol Weiss attorney representing parents suing the companies, told Game Developer that her firm took on the cases with the intention of “giving victims a voice,” a motivation she said is “at the heart” of the firm's work. “What started as a few complaints has escalated into a wave of litigation as families across the country realize they are victims of the same failures by Roblox and Discord to protect their children,” she said.
Walsh said Section 230 is “irrelevant” to her clients' claims. “Roblox will and does use it because every tech company automatically uses it when they get sued,” she said. “But they are overextending the application of this law. In our view, this law is intended to limit liability in cases where an internet service provider … publishes someone else's material.”
She said her firm's cases center on how these apps were released without adequate safety features while the companies allegedly misrepresented their protections for underage users. Adult predators could create profiles signaling they were children, and children could register for accounts without any parental involvement.
Game developers, however, may recognize that the phenomenon of underage users signing up for online games or services without parental permission is as old as… the internet. When asked about this, Walsh pointed to the difference between how other platforms, such as Instagram, at least try to enforce their minimum age policies, and how Roblox keeps friction to a minimum when minor users register on the platform.
“We're not saying that any particular measure will be perfect 100% of the time,” she said, alluding to age gates that might, for example, require a parent's email address to create an account. “But at least it creates some friction…at least it's making some kids think.”
Walsh said it is “easy” for children on Discord to disable parental controls without their parents' knowledge, and predators exploit this to groom their targets and lower their guard. A better system, she suggested, might automatically notify parents when those controls are disabled.
The two platforms are linked through Roblox's Discord integration. A Florida predator who abused Ethan Dallas, the child of one of Walsh's clients, reportedly lured Dallas off of Roblox and onto Discord, where he was able to continue sexually exploiting the teenager.
Dallas died by suicide in April 2024.
“Roblox is a gaming platform that is heavily promoted as safe and suitable for children,” Walsh said. “At the same time, the company knows that child predators come to the platform every day.” As evidence, she pointed to the regular reports Roblox Corporation submits to the National Center for Missing & Exploited Children, as well as news stories about the arrests of predators who preyed on minors on the platform.
And yet, despite all this, Roblox and Discord may still be protected by Section 230 in these civil cases.
Proving Section 230 doesn't apply may be difficult
Electronic Frontier Foundation attorney Aaron Mackey, the nonprofit's director of free speech and transparency litigation, acknowledged that it's difficult to untangle questions of liability and responsibility when it comes to protecting children online. The organization has been a strong supporter of Section 230, arguing that while some elements of the Communications Decency Act were flawed, the law provides vital protections for free speech online.
Mackey declined to comment on the specifics of the cases against Roblox Corp. and Discord. But in a conversation with Game Developer, he explained that communications platforms of all stripes have repeatedly been found not liable for offensive messages sent on their platforms because of Section 230. It may seem counterintuitive, but these protections are what make any online moderation possible.
Before Section 230, internet service providers CompuServe and Prodigy faced lawsuits over their policies for moderating what users posted on their servers. CompuServe said it would not moderate any content, while Prodigy said it would. Both were sued, and it was Prodigy that was found liable for content posted on its servers, precisely because it had a moderation policy.
Mackey said the law was created to let services decide what kinds of speech to allow on their platforms and to protect them as they enforce those policies. This raises the bar for civil lawsuits over messages sent between users.
Section 230 also appears to protect the kind of general promises about child safety that Roblox and Discord have made. “There are cases in which plaintiffs have tried to make this claim, namely that they do not seek to hold [platforms] responsible for the content of a message, but for statements about what they will do to protect users,” he said. “Those cases were not successful.”
Courts have also held that Section 230 provides immunity from lawsuits over the account creation process. “The courts ruled that 230 applied because the decision to provide public accounts was inherently related to the account holders' ability to create, view, and share content on the service,” Mackey said. “A legal action that seeks to change or limit a service's ability to carry out its desired account creation process would implicate 230, because it necessarily seeks to impose liability based on third-party content on the site.”
The successful cases have hinged on specific promises made by online platforms to specific users. Mackey recalled a case before the Ninth Circuit involving a user who experienced online abuse, sought help from the platform's owner, and was promised help, but the company never acted. The court held that Section 230 did not apply because the claim concerned the service's failure to fulfill its promise.
How can online platforms improve children's safety?
It's tempting to view Section 230 as the obstacle to holding online platforms accountable for user safety, but a number of policy gaps have produced this difficult status quo. Law enforcement agencies have been slow to respond to all kinds of online threats. The closed ecosystems of Roblox and Discord prevent other companies from offering third-party safety tools to parents. And laws crafted around online “child safety” have come under fire for their potential to block all kinds of unwanted speech.
Pair that with a global retreat from online moderation and you get a porous online ecosystem that stops some predators but lets others slip through. “The general industry trend of reducing moderation would be a disgusting excuse for putting children in danger,” Walsh told Game Developer.
“Other companies have successfully implemented common-sense safety mechanisms such as ID verification, mandatory parental approval by default, and strong deterrents to prevent messaging between children and adults. Corporations that market themselves as child-friendly have an undeniable responsibility to prioritize the safety of children.”
When reached for comment, a Discord spokesperson declined to discuss the specifics of the cases or whether the company planned to invoke Section 230 in its defense. “We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies,” they said.
Roblox Corp. did not respond to multiple requests for comment.