The company that owns and operates Roblox is facing a lawsuit alleging the game platform failed to protect a young girl from predators who exploited and manipulated her.
The suit, filed in San Francisco County Superior Court, identifies the now-13-year-old girl only as S.U. It claims she came into contact with adult men on Roblox in early 2020; the men then encouraged her to join Discord, Snapchat, and Instagram, where she was pressured to drink alcohol, take prescription drugs, send sexually explicit photos of herself, and even send money to one of the men. As a result, the suit claims, S.U. suffered mental health problems that led to multiple suicide attempts and hospitalization.
In addition to Roblox, the suit names Discord Inc., Snapchat parent company Snap Inc., and Instagram parent Meta Platforms Inc. S.U. and her mother claim that Roblox failed to take steps to keep minors on the platform safe, while Snapchat, Discord, and Instagram failed to take appropriate steps to verify S.U.'s age.
When reached by reporters, Meta declined to comment, and Roblox and Snap did not respond before this story went to print. A Discord spokesperson said the company has a "zero-tolerance policy for anyone who endangers or sexualizes children" but would not comment on the case itself. The suit seeks an unspecified amount in damages.
Roblox being called out for its poor handling of abusive content and behavior will not surprise anyone who has been following along; our prior reporting has documented several of the platform's failures on this front. The problem is hardly unique to Roblox, either — other online platforms, such as Twitch, face similar issues. Ultimately, it may take lawsuits like this one to force platforms to actually, and actively, police their spaces.