Moderation content pre-screening service

Background:
One of the greatest fears of Roblox devs is moderation, but it doesn’t need to be that way. When we hear complaints about moderation, it’s almost always after the punishment has already been handed out. A moderation action represents damage on all sides: the dev has already damaged the Roblox community by breaking a rule, the mod has damaged the dev, and players who grew attached to the moderated game/assets are caught in the crossfire. If we can find a way to prevent these rule infractions before they ever happen, no one gets hurt and everyone (players, devs, Roblox Corporation) wins. Prevention is always better than the cure.

Feature:
A service where devs can send content (images, text, audio, or video) to the moderation team to be pre-screened, paying a fee based on how much review time they want to purchase. If the review finds a rule infraction, the dev is educated on what needs to be changed to protect the Roblox community. If moderation had ample time to fully review the material (i.e. the dev paid for enough time) and no infractions were found, the content becomes “approved content”. If, at a later date, moderation changes its mind and wants the approved content taken down, it could demand that the dev do so within 48 hours, but the dev’s account would be safe from bans regarding approved content as long as they comply within that 48-hour window.

Further clarification:
Q: What if the dev pays for more time than is actually used?
A: They get refunded for the time that wasn’t used.
Q: Devs can already be moderated for decals, etc. that were approved on upload. Why would they deserve better treatment for content approved in this way?
A: The dev has gone out of their way to cooperate with the moderation team, and through their payment and willingness to be thoroughly examined by a human, they have proven their good intentions. A bad actor would not use this service; it would be a waste of their money.
Q: Why wouldn’t this be free?
A: Mods have a limited amount of time. It’s a similar case to audio uploads, which have fees based on how long they are (how long it takes to review them).
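To make the pricing and refund answers above concrete, here is a minimal sketch of how the proposed fee model might work. The per-minute rate, function name, and clamping behavior are all illustrative assumptions on my part, not anything Roblox has specified:

```python
# Hypothetical sketch of the proposed pay-per-review-time model.
# RATE_PER_MINUTE is an assumed placeholder rate, not a real price.
RATE_PER_MINUTE = 5  # Robux per minute of reviewer time


def review_cost(minutes_paid: int, minutes_used: int) -> tuple[int, int]:
    """Return (charge, refund) for one pre-screening request.

    The dev pre-pays for `minutes_paid` of reviewer time; any unused
    time is refunded, per the Q&A above. If the review runs out of
    purchased time, it simply stops (no extra charge).
    """
    minutes_used = min(minutes_used, minutes_paid)
    charge = minutes_used * RATE_PER_MINUTE
    refund = (minutes_paid - minutes_used) * RATE_PER_MINUTE
    return charge, refund


# Example: dev buys 30 minutes, the reviewer only needs 18.
charge, refund = review_cost(30, 18)
print(charge, refund)  # 90 Robux charged, 60 Robux refunded
```

The point of the proportional structure is the same as with audio upload fees: the price tracks the mods’ actual time spent, so longer or more complex content costs more to pre-screen.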

Examples of content that might be sent for review in this way:

  • Is this blood effect too extreme?
  • Is this animation “overly violent”?
  • Is this dialog discriminatory?
  • Is this skirt too short?

Notice that these examples all concern subjective qualities that exist on a continuum. There is no rule stating “a skirt cannot be shorter than X pixels”, nor should there be one. However, a reasonable person reading the rules would not know where to draw the line. The only people who know are the mods themselves, which is why a line of communication like the one suggested here would be a great educational resource for devs.


You could read the Terms of Service that you agree to when creating an account. That way, before you upload, you would know what is acceptable and what is not. A lot of people out there do not read the ToS, privacy policy, etc. and then wonder why they got moderated.

Pre-screening would never actually scale correctly, and we’d end up with an automated solution with false positives everywhere. I don’t know if I want to settle for that as a developer on the platform.
