Gain insight into 'toxicity' in your experience with safety analytics!

Hi Creators,

Today, we’re thrilled to introduce the new Safety Analytics dashboard. For the first time, creators will have access to insights on the frequency of abuse reports generated within their experiences. This will help you understand how your experience compares to others, and measure the impact of changes that you make.

You can find the Safety dashboard for your experiences in the Creator Dashboard under Safety > Overview. The dashboard provides a top-level view of user-submitted abuse reports within your experience.

Here’s a look at what’s new:

1. Understand Abuse Report Submitters per 1,000 Playtime Hours

This chart shows the rate of abuse reports in your experience, normalized per 1,000 hours of playtime. You can see this chart if your experience has had 1,000+ daily playtime hours in the past week.
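As a rough worked example (the exact counting rules are Roblox's; this assumes the metric counts unique submitters, as the chart title suggests): if 6 distinct users submitted abuse reports during a day with 3,000 hours of playtime, the chart would show 6 / 3,000 × 1,000 = 2 submitters per 1,000 playtime hours.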

  • How to read it: This metric lets you monitor the frequency of abuse reports related to your experience (it includes all reports submitted in-game as well as experience reports submitted from the experience details page). An increase in this number means more abuse reports are being generated in your experience, which can be an early indicator of growing toxicity. You may need to investigate further (e.g., did reports spike when you introduced a new feature?) and take action before it becomes a larger issue. The 90th percentile benchmark is provided as a point of comparison: if your experience is above this value, it is in the top 10% of experiences receiving the most abuse reports.

For example, a spike in abuse reports on February 2nd, after the launch of your new custom avatar editors, could indicate that users are misusing the feature or creating combinations that violate community standards. Potential solutions include rolling back the new functionality, increasing moderator support, or providing more proactive education on policies and community standards.

Please note, this metric is primarily for your information. Roblox always reviews each abuse report on its own merit. However, if the abuse in your experience becomes pervasive, we may ask you to make changes to improve safety within your experience.

2. Pinpoint negative behavior with abuse reports category breakdown

To help you take more targeted action, this chart breaks down abuse reports by category. The “Other” category groups all categories outside the top 9. These are the categories selected by users and reflect their understanding of each category. Our moderation teams confirm both the category and whether the content violates our policies before taking action on an abuse report.

  • How to read it: This breakdown helps you pinpoint the general types of negative behavior occurring in your experience so you can take more targeted action. For example, a high percentage in the “Romance or sexual” category might prompt you to review your in-game avatar editor tools or in-game chat systems. Or, if you see a high percentage in the “Cheating” category for an Obby game, you might want to design mandatory checkpoints before granting rewards or add custom events to flag players who have extreme capabilities (see the sketch below).
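As one illustration of that last idea, here is a minimal server-side sketch that flags “extreme capabilities” with a basic speed check. The threshold, the event name, and the use of AnalyticsService custom events are assumptions for this example; a real implementation would also need to debounce the logging and ignore legitimate teleports or vehicles.

```lua
local Players = game:GetService("Players")
local RunService = game:GetService("RunService")
local AnalyticsService = game:GetService("AnalyticsService")

-- Assumed ceiling for this example; the default WalkSpeed is 16 studs/second.
local MAX_STUDS_PER_SECOND = 40

local lastPosition = {}

RunService.Heartbeat:Connect(function(dt)
    for _, player in Players:GetPlayers() do
        local character = player.Character
        local root = character and character:FindFirstChild("HumanoidRootPart")
        if root then
            local previous = lastPosition[player]
            if previous then
                local speed = (root.Position - previous).Magnitude / dt
                if speed > MAX_STUDS_PER_SECOND then
                    -- Log a custom analytics event (hypothetical event name) so
                    -- suspicious players surface in your analytics.
                    AnalyticsService:LogCustomEvent(player, "SuspectedSpeedExploit", speed)
                end
            end
            lastPosition[player] = root.Position
        end
    end
end)

Players.PlayerRemoving:Connect(function(player)
    lastPosition[player] = nil
end)
```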

3. Filter by channel for targeted insights

You can filter both the abuse reports trend chart and the abuse reports category chart by channel to isolate reports related to a specific part of your experience:

  • Experience: Direct reports about the experience itself (e.g., inappropriate content).
  • Chat: Reports related to in-experience text chat.
  • Voice: Reports related to in-experience voice chat.
  • Audio: Reports related to audio assets used in the experience.
  • Avatar: Reports related to user avatars, clothing, or accessories.

When you apply a channel filter, the benchmark will also update to show you a comparison relevant to that channel.

4. Stay proactive with automated safety insights

To help you stay ahead of potential issues, the dashboard will automatically show an insight if your rate of abuse report submitters rises above the 90th percentile benchmark, either for your overall experience or for a specific channel.

5. New safety documentation for creator tools

We compiled new Safety documentation with an overview of the safety tools you can use to maintain a safe environment in your experience. Here are a few tools you can use to improve moderation (an illustrative sketch follows each list below):

Manage Player Behavior:

  • Ban API: Permanently remove disruptive users and help detect their alternate accounts.
  • Kick API: Temporarily remove abusive users from a server.
  • IsVerified API: Limit certain features (like ranked play or trading) to identity-verified users to discourage bad behavior.
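To make the list above concrete, here is a minimal server-side sketch of how these three APIs could be wired together. The reason strings, the 7-day duration, and the idea of gating trading behind verification are example choices, and the assumption that verification status is read directly from the Player object should be checked against the IsVerified documentation.

```lua
local Players = game:GetService("Players")

-- Kick: remove a player from the current server (they can rejoin later).
local function kickForAbuse(player: Player)
    player:Kick("Removed for abusive behavior. Please review the experience rules.")
end

-- Ban API: exclude a player from the experience for a set duration
-- (or permanently with Duration = -1), optionally covering suspected alts.
local function banForAbuse(player: Player)
    Players:BanAsync({
        UserIds = { player.UserId },
        Duration = 7 * 24 * 60 * 60, -- 7 days, in seconds
        DisplayReason = "Repeated abusive behavior.",
        PrivateReason = "Flagged by in-game moderation tooling.",
        ExcludeAltAccounts = false, -- false: also apply the ban to detected alts
        ApplyToUniverse = true,
    })
end

-- IsVerified: gate a feature (here, trading) behind identity verification.
-- Assumed to be exposed as a read-only property on Player; check the docs.
local function canTrade(player: Player): boolean
    return player.IsVerified
end
```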

Filter All User Text:

  • Use TextService to filter all non-chat player-generated text, such as signs or pet names, to block inappropriate content and personal information.
  • Use TextChatService to deliver messages for communication features; it filters chat text automatically.
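A minimal sketch of filtering non-chat player text (here, a pet name) on the server before showing it to other players. The function and fallback value are examples, not part of the announcement.

```lua
local TextService = game:GetService("TextService")

local function filterPetName(rawName: string, fromPlayer: Player): string
    local ok, filtered = pcall(function()
        local result = TextService:FilterStringAsync(rawName, fromPlayer.UserId)
        -- For text shown to everyone, use the broadcast variant of the result.
        return result:GetNonChatStringForBroadcastAsync()
    end)
    -- If filtering fails (e.g., a service error), fall back to a safe default
    -- rather than showing unfiltered text.
    return ok and filtered or "Pet"
end
```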

Ensure Policy Compliance:

  • Use PolicyService to adapt your experience based on a player’s region or platform policies. You can use this for things like advertisements, paid random items (loot boxes), and social media links.
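Here is a short sketch of adapting features to a player's policy info. Which features you gate (ads, loot boxes, social links) is up to your experience; the fields checked below are examples of what GetPolicyInfoForPlayerAsync returns.

```lua
local PolicyService = game:GetService("PolicyService")

local function applyPolicies(player: Player)
    local ok, policy = pcall(function()
        return PolicyService:GetPolicyInfoForPlayerAsync(player)
    end)
    if not ok then
        return -- if the lookup fails, keep restricted features off by default
    end
    if policy.ArePaidRandomItemsRestricted then
        -- Hide loot-box style purchases for this player.
    end
    if not policy.AreAdsAllowed then
        -- Remove ad surfaces for this player.
    end
    if not table.find(policy.AllowedExternalLinkReferences, "Discord") then
        -- Don't show a Discord invite to this player.
    end
end
```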

We are continuing to develop new analytics for violating scenes and a new moderation API to help you moderate UGC tools. Let us know if you have any questions in the comments below!

207 Likes


Great feature, but I’m incredibly worried this can be used to take down games by people spam reporting users inside the game.

Are there any preventions for this?

TO CLARIFY: I meant, could someone get a lot of alternate accounts and spam-report users in-game to inflate the stat?

41 Likes

y’all late I saw it in my analytics earlier

5 Likes

Great update, this information is really important for fostering a healthy and friendly community.

I’m sad to see that reports for bullying are the no. 1 issue in my game :frowning:

Time to add anti-bullying messaging in-game :innocent:

Edit: turns out the number of reports in my game is significantly lower than the benchmark. Phew.

29 Likes

Thank you for this, but I would strongly recommend rethinking the ‘Other’ category. It’s far too vague for us to determine what problem is actually happening, and it makes issues difficult to resolve if players widely use this category when submitting reports.

13 Likes

This also prevents developers from claiming ignorance of their community, which is a good thing

29 Likes

Are there any plans for us to eventually receive the abuse reports that are actually relevant to our games in particular, like reports from the Cheating or Scam categories? That would make these analytics even more valuable, since players are probably more inclined to report cheaters or scammers through that system than through our own in-game means.

15 Likes

I tried rereading, and I don’t think this is relevant to reporting the experience at all?

This correlates to in-game reports, y’know, when you press Esc and then report an individual inside that game.

3 Likes

These could also be reports of spam or other issues (there is quite literally no “Other” reporting category, so I have no clue where most of this data is coming from unless they’re reports that have already had action taken).

People tend to miscategorize violations that don’t have a category, and “Bullying” is so vague that most people would just pick that.

5 Likes

Can we have the ability to block content from our games? In my experience the #1 inappropriate thing I see in games is another user’s avatar using items that come from the Roblox catalog. There’s a lot of stuff that somehow passes moderation and makes its way into our games.

There are a lot of complaints about the Ban API not detecting alts well. Banning legitimate players teaches them a lesson; banning trolls will only send some of them to an alt account. If I ban somebody for wearing something inappropriate from the Roblox catalog, I also want to be able to effectively ban that item from the game. Nobody could wear it, and it would no longer appear in any engine API calls, like the catalog API. But right now this is a huge problem with no good, solid solution unless we run our own web server to deal with it.

TL;DR: I think one of the most powerful proactive tools we could have is the ability to block content ourselves instead of needing to wait for Roblox to take it down.

8 Likes

It doesn’t seem to me that this is too harsh for some people. I understand it’s about safety and all that, but romantic relationships are, how should I put it, where friendly relationships and the interest in playing begin. Yes, swearing is banned and all that, but I hope Roblox will only keep moving in a better direction. For ordinary players, constantly seeing hashtags in chat is kind of not normal. Sorry if I’m talking nonsense.

5 Likes

“Other” should be the combination of categories that don’t come up in the top 9 for your experience. Could you share your experience ID if “Other” is a top category for you?

4 Likes

I have a question about this that I and a lot of people in my communities have been wondering about for a long time. Does reporting someone for cheating actually do anything? Or is it just there to stop players from reporting exploiters under random categories they might otherwise choose? I’ve seen people get moderated after being reported for text and avatar infractions, but I have never seen a report for cheating work. I always tell my community members to report any exploiters they see for cheating just in case it does do something, but if having a lot of reports could now potentially affect my game’s place in the algorithm, I don’t know if I should keep telling people to do that.

I also agree with what some people have said here so far. Allowing game owners to see some of the reports might be useful. For example, if someone gets reported a bunch of times for cheating in a short period of time, maybe a game developer could be given that information to watch out for that particular player.

3 Likes

I feel that reports of cheating and scamming, at least, should always be passed along to the game’s developers and/or moderators, because those categories are much more likely to contain reports specific to our own games that Roblox won’t be able to action on (e.g., a cheat might actually be a player abusing a bug in our game, or a scam might happen through our own in-game features).

13 Likes

Just make it so we can see the number of reports for each category, because an increase in reports doesn’t always mean an increase in toxicity.

3 Likes

You can do that by clicking Explore and then Breakdown by=Category

4 Likes

idk about y’all but i am very curious to see forsaken’s ‘toxicity’ analytics

18 Likes

thanks :+1: i didn’t read the whole post

3 Likes

Sorry, what I meant was: could someone, in theory, get many alternate accounts and have them report each other in-game for saying harmful stuff? This could easily be done on small experiences, and Roblox also has a botting problem.

4 Likes