Roblox Rolls Out Age-Verification Requirement for Chat
A mother is suing Roblox, a popular online platform, in the hope of preventing other parents from experiencing the same tragedy: a child's suicide.
The new safety measure comes as Roblox faces criticism and at least 35 lawsuits alleging that users meet and abuse children on the gaming platform, or that the platform helps facilitate child sexual exploitation and grooming.
Roblox, under scrutiny for child safety concerns, will require users to verify their age through a selfie or an ID. Users will be allowed to chat only with others in the same age group unless they confirm they know the other person.
FIRST ON FOX: Kentucky brought a civil lawsuit on Monday against Roblox alleging the massive gaming platform is not safe for children, joining a growing list of entities that have sued over similar allegations. Attorney General Russell Coleman alleged in ...
"It was the first time that a court ... analyzed the application of ending forced arbitration in a situation like this," said Pat Huyett, a partner at Anapol Weiss who is representing plaintiffs in litigation against Roblox.
The online gaming platform Roblox will begin using AI technology to scan users' faces for age verification. Roblox's senior director of product policy, Eliza Jacobs, spoke with NBC News' Tom Llamas about the new feature aimed at preventing unverified users from accessing chat.