Under outside pressure, Roblox announced updates to its safety systems and parental controls today to protect children.
In a blog post, Matt Kaufman, chief safety officer at Roblox, said the updates will better protect the platform’s youngest users and provide easy-to-use tools that give parents and caregivers more control and clarity over what their children do on Roblox.
The moves come as Roblox has drawn fire in media reports, including a Bloomberg story on “Roblox’s pedophile problem.” Safety is a major topic for every game company these days, and it is no small task for Roblox, which has 90 million daily active users across 190 countries and six million games on the platform.
The company is adjusting built-in limits around how children under age 13 can communicate. And parents can now access parental controls from their own devices rather than from their child’s device and monitor their child’s screen time.
“Safety is and always has been foundational to everything we do at Roblox. We’ve spent nearly two decades building strong safety systems, but we are always evolving our systems as new technology becomes available,” Kaufman said. “We regularly ship updates to our safety and policy systems. We’ve already shipped more than 30 improvements this year.”
Today, Roblox made changes to parental controls, changes to how users under age 13 can communicate on Roblox, new content labels, and additional built-in protections for younger users. Some of the policy changes have been in the works for more than a year.
These changes were developed and implemented after multiple rounds of internal research, including interviews, usability studies, and international surveys with parents and kids, and consultation with experts from child safety and media literacy organizations.
Dina Lamdany, product manager at Roblox, said in a press briefing that the changes were based on Roblox’s own internal user research as well as consultation with external experts.
Roblox is making these changes for its youngest users by: (1) making it easier and more intuitive for parents to manage their child’s settings, and (2) updating built-in limits to provide certain protections, independent of parental controls. These changes have been supported by partners, including the National Association for Media Literacy Education (NAMLE) and the Family Online Safety Institute (FOSI).
Stephen Balkam, CEO of FOSI, said in a statement, “FOSI applauds Roblox’s ongoing efforts to prioritize the safety and well-being of its youngest users. By empowering parents with new controls that allow them to oversee their child’s activity in a flexible, meaningful way, Roblox is taking significant steps toward building a safer digital environment.”
Enabling remote parental controls
Roblox already provides parental controls, including spend limits. But those have been managed from the child’s account. Today the company is launching remote management, which allows parents and caregivers to adjust controls and review their child’s activity even if they aren’t physically together.
Parents who want to be more involved in monitoring their child’s activities can link their Roblox account to their child’s account—after verifying themselves using an ID or credit card. After linking accounts, parents can manage their child’s experience and access from Parental Controls.
Roblox is also enabling friends list and screen-time monitoring. Part of the long-term vision is to give parents granular controls to monitor and limit how much time their child spends on Roblox.
In the Parental Controls dashboard, parents can now see their child’s average screen time over the past week, as well as their child’s friends list. Parents and caregivers can also set daily screen-time limits. Once the limit is reached, their child cannot access Roblox until the next day.
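As a rough sketch of how such a daily limit could be enforced (the class and method names here are illustrative assumptions, not Roblox’s actual implementation):

```python
from datetime import date

class ScreenTimeTracker:
    """Hypothetical tracker for the daily screen-time limit described above."""

    def __init__(self, daily_limit_minutes: int):
        self.daily_limit = daily_limit_minutes
        self.minutes_used = 0
        self.day = date.today()

    def record_play(self, minutes: int) -> None:
        self._reset_if_new_day()
        self.minutes_used += minutes

    def can_play(self) -> bool:
        # Once the limit is reached, access is blocked until the next day.
        self._reset_if_new_day()
        return self.minutes_used < self.daily_limit

    def _reset_if_new_day(self) -> None:
        if date.today() != self.day:
            self.day = date.today()
            self.minutes_used = 0
```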
Roblox also has an updated Parent and Caregiver Guide.
Built-in restrictions for communications
Connecting with others is core to the Roblox experience. But Kaufman said the company wants to facilitate that with safety in mind. Over the next few months, Roblox is changing how children under age 13 can communicate on the platform.
As a reminder, the built-in filters and moderation rules for communication still apply for any chat features and for users of all ages, including those 13 and older.
Now, users under the age of 13 will no longer be able to directly message others on Roblox outside of games or experiences (also known as platform chat).
In addition, Roblox is introducing a built-in setting that will limit users under age 13 to public broadcast messages only within a game or experience. By default, users younger than 13 will not be able to directly message others. Parents can change this setting in Parental Controls.
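In code, those under-13 defaults might look something like the following sketch (the setting names are hypothetical; Roblox’s actual settings model is not public):

```python
def default_chat_permissions(age: int, parent_allows_dms: bool = False) -> dict:
    """Illustrative defaults mirroring the policy described above."""
    if age < 13:
        return {
            # No direct messages outside games or experiences (platform chat).
            "platform_chat": False,
            # Direct messages inside an experience are off unless a parent
            # changes the setting in Parental Controls.
            "in_experience_direct_messages": parent_allows_dms,
            # Public broadcast messages within an experience remain available.
            "in_experience_public_broadcast": True,
        }
    # Users 13 and older keep the existing defaults (filters still apply).
    return {
        "platform_chat": True,
        "in_experience_direct_messages": True,
        "in_experience_public_broadcast": True,
    }
```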
The company said it is constantly evolving and innovating its safety systems.
“We are always working to make chat incredibly safe and are exploring new ways for users of all ages to communicate and interact safely on Roblox,” Kaufman said. Many of these updates are launching today; for others, Roblox is actively working with the creator community to implement updates across all experiences. The company expects all of these changes to be in place by the first quarter of 2025.
Content maturity limits
With content labels, Roblox will work closely with kids and parents to understand their knowledge of Roblox’s platform; the information and controls they are looking for; and the concerns they have around safety, engagement, and communication on the platform.
Children develop on different timelines and, from both Roblox’s own research and external research, the company knows that parents have different comfort levels regarding the type of content their child engages with. Labeling experiences based purely on age does not respect the diverse expectations different families have.
Today, Roblox is launching simplified descriptions of the types of content available. Experience Guidelines will be renamed Content Labels, and Roblox will no longer label experiences by age. Instead, it will label experiences based on the type of content users can expect in an experience.
These updates should provide parents greater clarity to make informed decisions about what is appropriate for their child.
Roblox has also updated its built-in maturity settings for the youngest users. Users under nine can now only access “Minimal” or “Mild” content by default and can access “Moderate” content only with parental consent. Parents still have the option to select the level they feel is most appropriate for their child.
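A minimal sketch of those under-nine defaults, assuming a simple ordered set of labels (the function is illustrative, not Roblox’s API):

```python
def can_access(age: int, label: str, parental_consent: bool) -> bool:
    """Under-nine maturity defaults as described above; a sketch only."""
    if age < 9:
        if label in ("Minimal", "Mild"):
            return True              # accessible by default
        if label == "Moderate":
            return parental_consent  # requires parental consent
        return False                 # anything more mature is off-limits
    # Older age groups follow their own defaults and parental settings,
    # which are not spelled out here.
    return True
```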
And there will be age gating. Traditional rating systems apply to content, but don’t consider user behavior when determining an age rating. Roblox will take a more restrictive approach for younger users, and the company is now age-gating certain experiences for users under age 13, based on the type of user behaviors sometimes found in those experiences.
These new restrictions apply to experiences primarily designed for socializing with users outside of their friends list and experiences that allow free-form writing or drawing, such as on a chalkboard or a whiteboard or with spray paint.
Age-based settings: Roblox adapts as children grow up, so a child’s built-in account settings will automatically update as they move from one age group to another. The company wants parents and children to have an opportunity to discuss their current Roblox usage, what features are appropriate going forward, and whether to make any updates to the built-in settings.
To facilitate those conversations, Roblox will notify the child and linked parents about upcoming changes to the child’s age-based settings 30 days before the changes go into effect.
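One way to compute that notification date, assuming age-group boundaries at nine and 13 (both the boundaries and the date logic here are a sketch, not Roblox’s implementation):

```python
from datetime import date, timedelta

AGE_GROUP_BOUNDARIES = [9, 13]  # illustrative ages where built-in settings change

def notification_date(birthdate: date, today: date | None = None) -> date | None:
    """Date to notify the child and linked parents: 30 days before the child
    crosses into the next age group. Leap-day birthdays are not handled
    in this sketch."""
    today = today or date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    for boundary in AGE_GROUP_BOUNDARIES:
        if age < boundary:
            # The birthday on which the child enters the next age group.
            crossing = birthdate.replace(year=birthdate.year + boundary)
            return crossing - timedelta(days=30)
    return None  # past the last boundary; no further automatic changes
```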
Continuing to prioritize safety
When Roblox designs new products, it is mindful of several important factors. The company is fundamentally a platform for play, which differs from other places on the internet, where the focus is on browsing or consuming content. Since launch, Roblox has had a growing population of younger users, and the company wants to help keep them safe on the platform.
“We take safety extremely seriously,” Kaufman said.
The company said it is grateful for the contributions and support it has received from child safety and media literacy organizations and child development experts. These experts provided input, reviewed updates, and shared perspectives that helped the company make these controls as useful as possible for both parents and kids.
Executive Director of NAMLE Michelle Ciulla Lipkin said in a statement, “As media literacy experts, we commend efforts to improve safeguards, mitigate risk, and give parents and kids an opportunity to engage together around important topics like privacy and security. NAMLE is proud to partner with Roblox and support their commitment to making the online space safer and more civil for young people.”
If a child wants to join an experience and sees a lock, that means they need parental permission, which they can request from within Roblox. The parent then receives an email on their own device saying they have a request from their child; they can review information about the experience and approve or deny it.
For the sake of privacy, Roblox does not require users to submit a government ID simply to use the platform. But it does require one to access certain features, like restricted content or features that require a phone. Parents, caregivers, or guardians are required to provide a government ID showing that they’re a relative, or to provide a credit card authorization, Lamdany said.
For verification, Roblox requires a live selfie and it uses third-party vendors who provide identification technology.
“While there is no ‘perfect’ when it comes to online safety, our overall approach is systematic and thoughtful,” Kaufman said. “We regularly update our policies and systems to help keep children safe on Roblox—regardless of whether parents elect to use our parental controls. Our goal is to make Roblox the safest and most civil online platform possible because it is the right thing for children, their parents and caregivers, our investors, and our company.”
A big investment in human capital
Ten percent of Roblox’s full-time employees, a couple of thousand people around the world, work on safety, Kaufman said in a press briefing.
“Like 20 years ago, safety remains our number one priority,” Kaufman said. “We are dedicated to building safety systems that keep all of our users safe and make sure that all of the experiences that are available on the platform conform to our policies.”
“We don’t believe that the number of moderators is commensurate with the quality of the moderation that happens on the platform,” Kaufman said. “We utilize industry-leading machine learning and AI to do a significant amount of moderation automatically. We do this so that it can happen in real time and scale automatically during the day as the number of users increases.”
Human moderators are focused on appeals from the users and handling the most complex questions, which often are routed to investigators who spend more time digging into the details.
“The platform is primarily used by kids and that has informed how we have developed policies,” Kaufman said. “The centerpiece of our policies is our community standards. They govern what is allowed on the platform and what is appropriate for users of different ages. We believe the community standards are some of the strictest policies in the industry and the foundation of these policies is really from the beginnings of Roblox where there was primarily kids on the platform.”
For example, the policies prohibit profanity everywhere except among users verified to be over 17. They prohibit depictions of tobacco and pharmaceuticals, and they prohibit references to or depictions of drunkenness. Romantic or flirtatious gestures between users are also prohibited on the platform.
All of the safety systems are built around these policies.
“Keeping our users safe requires a multi-tiered approach to safety. The first tier is our community standards and policies,” Kaufman said. “These address both safety concerns and content maturity. What content should be available to which users based on their age and their parents’ settings. All content submitted to Roblox goes through automated moderation. This moderation looks at images, videos, audio files, and 3D models.”
Roblox identifies problematic content before it is exposed to other users, and it immediately removes it from the platform and addresses the issue with the users who submitted that content.
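A stripped-down sketch of that multi-tier flow, with a stub standing in for the machine-learning classifiers (all names and thresholds here are assumptions, not Roblox’s systems):

```python
from dataclasses import dataclass

def classifier_stub(payload: bytes) -> float:
    """Stand-in for an ML model returning a violation score in [0, 1]."""
    return 0.0

# Every submitted asset type goes through automated moderation.
CHECKS = {
    "image": classifier_stub,
    "video": classifier_stub,
    "audio": classifier_stub,
    "model3d": classifier_stub,
}

@dataclass
class Verdict:
    allowed: bool
    escalate_to_investigators: bool = False

def moderate_submission(asset_type: str, payload: bytes) -> Verdict:
    check = CHECKS.get(asset_type)
    if check is None:
        return Verdict(allowed=False)  # unknown asset types are rejected
    score = check(payload)
    if score > 0.9:
        # Clear violations are removed before other users ever see them.
        return Verdict(allowed=False)
    if score > 0.5:
        # Ambiguous cases go to human investigators for deeper review.
        return Verdict(allowed=False, escalate_to_investigators=True)
    return Verdict(allowed=True)
```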
To prevent predators from communicating with kids in private, Roblox does not encrypt any communication between users, regardless of their age.
“And we have automated systems that automatically identify violations in our community standards and take actions accordingly,” Kaufman said. “Evidence of any critical harm on the platform is immediately escalated to our team of investigators to review and action accordingly. We also filter inappropriate content from text communication. We use industry-leading automated filters across many languages and this is essential for blocking exposure to violative behavior to our youngest users.”
Kaufman added, “Our text filters are also specifically designed to block sharing of personally identifiable information. And this includes attempts to take conversations off Roblox where safety standards and moderation systems are less stringent than what we expect on our own platform.”
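As an illustration of the kind of text filtering Kaufman describes, a toy PII filter might look like this (the patterns and replacement policy are assumptions, not Roblox’s actual rules):

```python
import re

PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),         # US-style phone numbers
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),               # email addresses
    re.compile(r"\b(discord|snap(chat)?|whatsapp)\b", re.I),  # off-platform invites
]

def filter_chat(message: str) -> str:
    """Replace anything matching a PII pattern with hashes, as chat
    filters commonly do."""
    for pattern in PII_PATTERNS:
        message = pattern.sub(lambda m: "#" * len(m.group()), message)
    return message

print(filter_chat("add me on discord, 555-123-4567"))
# -> "add me on #######, ############"
```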
And finally, Roblox does not allow users to exchange images or videos in chat.