mtoto.news

Roblox Introduces Stricter Safeguards for Children Under 13

November 19, 2024

Roblox, a popular online gaming platform, has announced measures to enhance safety for children under 13. As part of its new child protection efforts, users in this age group will be blocked from sending direct messages (DMs) unless a verified parent or guardian provides permission.

The changes, which will be rolled out from Monday and fully implemented by March 2025, also include features allowing parents to manage their child’s account. These tools will enable parents to monitor online friends, set daily playtime limits, and control the type of content their children can access.

Under the updated rules, children will still be able to participate in public in-game conversations but will be restricted from having private chats without parental consent. The aim is to create a safer environment while preserving the social aspect of the platform.

Matt Kaufman, Roblox’s Chief Safety Officer, highlighted the platform’s commitment to safety. “As our platform has grown in scale, we have always recognized that our approach to safety must evolve with it,” he said. With 88 million daily users, Roblox has allocated over 10% of its workforce to safety initiatives.

The platform is introducing a redesigned parental dashboard, giving parents better oversight of their child’s activities. Parents must verify their identity using government-issued ID or credit card information to activate these permissions.

Mr. Kaufman emphasized the importance of accurate age information during account creation, urging parents to work with their children to ensure compliance.

Richard Collard, associate head of policy for child safety online at the NSPCC, described the changes as “a positive step in the right direction.” However, he stressed the need for robust age verification to ensure the effectiveness of these measures.

Roblox is also replacing age recommendations for games with detailed “content labels” that outline the nature of the experience. These labels range from “minimal,” which may include mild violence or fear, to “restricted,” which could feature strong violence or mature themes.

By default, children under nine will only be able to access “minimal” or “mild” content, while parental consent is required for “moderate” games. “Restricted” content remains inaccessible until users are at least 17 years old and have verified their age.

In addition to these changes, Roblox announced earlier that it would block under-13s from accessing “social hangouts,” where text and voice messaging are prominent. Starting December 3, developers will also be required to label their games as suitable or unsuitable for children under 13, with restrictions applied to non-compliant games.

These updates align with the UK’s upcoming Online Safety Act, which mandates stricter protections against illegal and harmful material on platforms accessed by children. Ofcom, the UK regulator, will begin enforcing these rules, with codes of practice set to be published in December.

The moves represent Roblox’s effort to address growing concerns about child safety while maintaining its position as a leading platform for young gamers.

BBC

