The Roblox logo displayed on a banner in New York. Image: Reuters.
On Monday, video game maker Roblox announced new safety measures to protect users under the age of 13, including a rule preventing minors from sending direct messages to others without parental consent. The platform, which reported roughly 89 million users in the last quarter, is also enabling parents and caregivers to remotely oversee a child's Roblox account: they can view friend lists, set spending limits, and manage screen time.
Roblox has faced allegations of child abuse occurring on its platform. In August, Turkey temporarily blocked access to Roblox under a court order while prosecutors investigated concerns that user-generated content could lead to abuse. A 2022 lawsuit filed in San Francisco accused Roblox of facilitating the sexual and financial exploitation of a California girl by adult men, alleging that the platform's environment encouraged her to engage in harmful behaviors, including drinking, abusing prescription drugs, and sharing explicit photos.
In response to these concerns, Roblox has introduced a built-in setting that limits users under 13 to public broadcast messages within games or experiences. The company is also replacing age-based content labels with descriptors ranging from 'Minimal' to 'Restricted' that indicate the type of content users can expect. By default, users under nine can access only games labeled 'Minimal' or 'Mild.' The new restrictions will also prevent users under 13 from searching for, discovering, or playing unlabeled experiences. Content labeled 'Restricted' will remain inaccessible until a user is at least 17 years old and has verified their age.
Source link: https://www.khaleejtimes.com