Roblox has announced that it will prevent users under the age of 13 from sending messages to others on its online gaming platform, as part of new measures aimed at protecting children.

By default, child users will be unable to send direct messages within games unless a verified parent or guardian grants them permission. Parents will also gain the ability to oversee and manage their child’s account, including viewing their online friends list and setting daily limits on their playtime.

Research by Ofcom indicates that Roblox is the most popular gaming platform among children aged eight to 12 in the UK. However, the platform has faced calls to make its experiences safer for younger users.

The company said the changes would begin rolling out on Monday and were expected to be fully in effect by the end of March 2025. This means young children will retain access to public conversations visible to everyone within games, allowing them to communicate with friends, but will be restricted from private conversations without parental approval.

Matt Kaufman, Roblox’s chief safety officer, said 88 million people play the game daily, and that more than 10% of its total workforce, amounting to thousands of employees, is dedicated to the platform’s safety features.

“As our platform has grown in scale, we have always recognised that our approach to safety must evolve with it,” he said.

In addition to barring children from sending direct messages (DMs) across the platform, Roblox will give parents more ways to monitor and control their child’s activities. To manage parental permissions for their child through a linked account, parents and guardians must verify their identity and age using a government-issued ID or a credit card.

However, Mr. Kaufman conceded that identity verification is a challenge for many technology companies, and urged parents to make sure a child’s account reflects their accurate age.

“Our goal is to keep all users safe, no matter what age they are,” he said. “We encourage parents to be working with their kids to create accounts and hopefully ensure that their kids are using their accurate age when they sign up.”

Richard Collard, associate head of policy for child safety online at the UK children’s charity the NSPCC, described the changes as “a positive step in the right direction.” Nevertheless, he stressed that they must be backed by effective methods of user age verification and checking to “translate into safer experiences for children.”

“Roblox must make this a priority to robustly tackle the harm taking place on their site and protect young children,” he added.

Roblox also announced that it will simplify the content descriptions available on the platform, replacing age recommendations for specific games and experiences with “content labels” that plainly describe a game’s nature. This approach, according to Roblox, lets parents base decisions on their child’s maturity rather than their chronological age.

The labels range from “minimal,” which might involve infrequent mild violence or fear, to “restricted,” which may include more mature content such as strong violence, language, or significant realistic blood.

By default, Roblox users younger than nine years old will only have access to “minimal” or “mild” experiences.
However, parents can grant consent for them to play “moderate” games. Users are barred from accessing “restricted” games until they are at least 17 years old and have verified their age using the platform’s tools.

The announcement follows Roblox’s declaration in November that it would block users under 13 from “social hangouts,” where players communicate via text or voice messages, from Monday. Developers were also told that, as of 3 December, game creators must indicate whether their games are appropriate for children, and that games lacking this information will be blocked for users under 13.

These changes come as platforms used by children in the UK prepare to comply with new rules on illegal and harmful content on their services, mandated by the Online Safety Act. Ofcom, the UK regulator responsible for enforcing the legislation, has warned that companies will face penalties if they fail to keep children safe on their platforms. The watchdog is set to publish its codes of practice for companies to follow in December.
