Roblox Boosts Child-Safety Settings by Auto-Blocking DMs for Users Under 13, Expands Parental Controls on Violent, Sexually Themed Content

Roblox, a popular online gaming platform known for its user-generated content, has recently come under fire for its child-safety protections. In response, the company has announced several new measures aimed at enhancing the safety and well-being of its younger users.

One of the most significant changes is the automatic blocking of direct messages (DMs) for users under the age of 13. By default, under-13 accounts will no longer be able to send or receive DMs, reducing young children’s exposure to inappropriate or harmful messages from other players.
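
Conceptually, a default-off setting like this amounts to a simple permission check keyed on account age. The Python sketch below is purely illustrative: every name in it (`Account`, `can_use_dms`, the parental opt-in flag) is hypothetical, and the assumption that a verified parent can re-enable messaging may not match Roblox’s actual policy or code.

```python
from dataclasses import dataclass

# Hypothetical sketch only; none of these names come from Roblox's
# actual codebase or API.

MIN_DM_AGE = 13  # DMs are disabled by default below this age


@dataclass
class Account:
    age: int
    dm_enabled_by_parent: bool = False  # assumed parental opt-in


def can_use_dms(account: Account) -> bool:
    """Return True if the account may send or receive direct messages."""
    if account.age < MIN_DM_AGE:
        # Under-13 accounts are blocked by default; in this sketch a
        # verified parent must explicitly re-enable messaging.
        return account.dm_enabled_by_parent
    return True


# A 10-year-old account is blocked unless a parent opts back in.
assert can_use_dms(Account(age=10)) is False
assert can_use_dms(Account(age=10, dm_enabled_by_parent=True)) is True
assert can_use_dms(Account(age=15)) is True
```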

In addition to this measure, Roblox is also expanding parental controls on violent and sexually themed content. Parents will now be able to set a maximum level of “mature” content for their children, regardless of the child’s age, limiting access to material such as violence, strong language, or suggestive themes according to their individual preferences and values.
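
A parent-set maturity limit of this kind can be pictured as a threshold check: content is shown only if its rating does not exceed the cap the parent chose. The sketch below is again hypothetical; the level names and the `is_visible` helper are invented for illustration and do not correspond to Roblox’s actual content labels.

```python
from enum import IntEnum

# Hypothetical maturity scale; the names are illustrative, not
# Roblox's actual content labels.


class Maturity(IntEnum):
    MINIMAL = 0     # suitable for all ages
    MILD = 1        # e.g. mild violence or crude humor
    MODERATE = 2    # e.g. strong language, moderate violence
    RESTRICTED = 3  # e.g. suggestive themes, intense violence


def is_visible(content_level: Maturity, parent_limit: Maturity) -> bool:
    """Content is visible only if it does not exceed the parent's limit."""
    return content_level <= parent_limit


# A parent who caps maturity at MILD hides MODERATE content.
assert is_visible(Maturity.MILD, parent_limit=Maturity.MILD)
assert not is_visible(Maturity.MODERATE, parent_limit=Maturity.MILD)
```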

These changes are a response to growing concerns about the safety of children on online gaming platforms like Roblox. The company has faced criticism in recent months for its handling of child-safety issues, with some experts arguing that the platform’s default settings put young users at risk of exposure to inappropriate content and interactions.

Roblox’s decision to auto-block DMs for under-13s is a significant step towards addressing these concerns. By preventing young children from receiving unsolicited messages, the platform can help reduce the risk of grooming, harassment, or other forms of exploitation. This move brings Roblox in line with other popular online platforms, such as social media sites and messaging apps, which have long had similar restrictions in place.

The increased parental controls on violent and sexually themed content are also a welcome addition to the platform. By giving parents more power to regulate their children’s exposure to certain types of content, Roblox is helping to ensure that young users have a safer and more age-appropriate experience.

It’s worth noting that these changes are the latest in a series of efforts by Roblox to strengthen child-safety protections on its platform. In recent years, the company has implemented several measures aimed at reducing the risk of harmful interactions and content, including machine learning systems that detect and remove inappropriate material, as well as partnerships with child online-safety experts.

Despite these efforts, some critics argue that Roblox still has work to do in order to ensure the safety and well-being of its youngest users. For example, some have called for more robust moderation of user-generated content, as well as better tools for parents to monitor and control their children’s online activities.

Overall, however, Roblox’s latest changes represent a positive step towards a safer, more child-friendly platform. By acting proactively on child-safety concerns, the company is demonstrating a commitment to protecting its users, young and old. As online gaming platforms continue to grow in popularity, these changes help set a new standard for child-safety protections in the industry.
