Roblox, Age and Child Safety
The new safety measure comes as Roblox faces criticism and at least 35 lawsuits alleging that users meet and abuse children on the gaming platform, or that the platform facilitates child sexual exploitation and grooming.
Roblox will soon require users to go through an age estimation process if they want to chat with other people on the platform. The requirement takes effect for all users in early January.
Roblox CEO David Baszucki joined CNN’s Pamela Brown to discuss new safety measures to protect young users, including AI age-verification technology. The announcement comes amid a growing number of lawsuits alleging that the company failed to deter sexual predators from targeting minors on the popular gaming platform.
Roblox said it is "disappointed" that it is being sued based on "misrepresentations and sensationalised claims".
Roblox has just been hit with another lawsuit over its child safety issues. Texas Attorney General Ken Paxton announced on social media on Thursday that the state was suing Roblox for "putting pixel pedophiles and profits over the safety of Texas children."
A new series of age checks and chat groupings is coming to Roblox, intended to keep children from falling prey to predators.
Texas AG Ken Paxton said the children have been “repeatedly exposed to sexually explicit content, exploitation and grooming” on the online game platform.
Roblox Taking Steps to Block Children from Talking to Adults
Looking to address continued controversies over child safety, Roblox is implementing new measures to create a safer gaming environment.