As a developer, I find that a lot of players want to report Roblox rule-breaking behaviour they witness from other users in my experience. They send this information to me, the developer, instead of using the Roblox report feature. This is because of a widespread belief among users that reports to Roblox fall on deaf ears and that the Roblox moderation team doesn't do anything.
I don't think Roblox moderation is the best by any means, but it is still disheartening to see such poor faith in the moderation system that users don't even engage with it. Instead, they come to the developer, thinking we are more likely to act on it, even though we don't have access to platform-level moderation.
(I’m going to go out on a limb and have faith that user reports can actually lead to moderation action by the Roblox team in the context of the post.)
Reports feel like they don’t do anything.
There is absolutely zero positive feedback for reporting a user in the current system. There is no way to know if a report has actually done anything, which feels really bad. Instant gratification is impossible (a report isn't going to get someone instantly removed without going through the moderation team first), so without any delayed gratification either, reports feel like they do absolutely nothing. It's like shooting into the void: users will usually never know whether their actions (taking the time to file a report) led to any real outcome.
I am suggesting that users get notified, e.g. a system message through their private messages, when the subject of their report gets moderated (i.e. delayed positive gratification/reward to reinforce positive engagement with the system).
Some games (e.g. League of Legends) already have this feature. The message can simply state: "Your recent report has been verified. The violating user has been moderated. Thanks for the report." It doesn't need any details. The same short message can be sent to all users who recently reported the moderated user/content.
All I am asking for is a way for users to get occasional positive feedback on recent reports of rule-breaking behaviour. Not often, but just enough that users know their reports actually do something.