As a Roblox developer, I am unable to keep up with the growing moderation needs of my game. Many games let players freely build a house, create an avatar, and generally produce a variety of unique UGC content. This opens up a lot of creativity, but it also poses a serious challenge: what to do about bad actors. There are currently no methods for handling in-game moderation of this kind of content, and it is a growing concern.
If Roblox were to solve this problem, anybody making a UGC-based game would have sufficient tools to implement their own automatic moderation for in-game creations.
The upcoming Open Cloud asset upload APIs could help developers self-moderate their games. One example I can think of: upload a model of a user’s creation, and if the model gets moderated and deleted, take action against that creator.
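To make that concrete, here’s a rough Python sketch of what I imagine the flow looking like. Since the APIs are still upcoming, the endpoint paths, request fields, the `.rbxm` content type, and the moderation-state polling are all assumptions on my part:

```python
# Rough sketch: upload a player's build as a Model asset via Open Cloud,
# then poll the long-running operation until moderation has run.
# Endpoint paths and response fields are assumptions about the preview API.
import json
import time
import requests

API_KEY = "YOUR_OPEN_CLOUD_KEY"  # hypothetical key with asset write scope
BASE = "https://apis.roblox.com/assets/v1"

def upload_build(rbxm_bytes: bytes, creator_user_id: int) -> str:
    """Upload the serialized build and return the operation path."""
    request_body = {
        "assetType": "Model",
        "displayName": "Player build (moderation check)",
        "creationContext": {"creator": {"userId": creator_user_id}},
    }
    resp = requests.post(
        f"{BASE}/assets",
        headers={"x-api-key": API_KEY},
        files={
            "request": (None, json.dumps(request_body), "application/json"),
            "fileContent": ("build.rbxm", rbxm_bytes, "model/x-rbxm"),
        },
    )
    resp.raise_for_status()
    return resp.json()["path"]  # e.g. "operations/<id>" -- assumed shape

def wait_for_moderation(operation_path: str) -> str:
    """Poll the operation and return the final moderation state."""
    while True:
        resp = requests.get(f"{BASE}/{operation_path}",
                            headers={"x-api-key": API_KEY})
        resp.raise_for_status()
        op = resp.json()
        if op.get("done"):
            # "moderationResult" is an assumption about the response shape
            return op["response"]["moderationResult"]["moderationState"]
        time.sleep(5)

# If wait_for_moderation(...) came back "Rejected", the game could then
# flag or sanction that creator in-experience.
```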
Good example; however, knowing Roblox and how the old “save instance” service used to go, it would probably be scuffed and get the game creator banned …
Definitely can support this. For ERLC, we had to implement manual approval for the custom liveries users upload in their private servers. Obviously this takes manpower that comes at our expense, so a solution here would be insanely helpful.
Thank you for bringing this up, @Usering.
We’re actively exploring and building moderation tooling to keep Roblox a safe and civil platform. Feel free to mention any tools or APIs that could help you manage moderation efforts for your experience!
Hey! I’m currently struggling with how I should approach content moderation in my experience.
My current project will let players build little self-contained objects in a reserved server, which are then shown publicly for everyone to see. (It’s a reimagining of this game I made as a kid, if you want more context.)
Moderation is a huge concern of mine since the game revolves so heavily around UGC. Players can build something inappropriate and have it presented to everybody in the server through a cutscene. I kind of just ignored this problem in the original game because I didn’t know how to approach it, but I know that won’t fly today with Roblox being as big as it is now.
I don’t have the resources to manually moderate everything players build, and third-party solutions (e.g. Google Cloud SafeSearch) don’t work very well for detecting profanity in blocky, Roblox-esque builds as far as I’ve tested. Reactive systems like an in-game report button aren’t great either, because by the time something gets reported, everyone in the server has already seen the offending object. The damage has already been done.
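For context, my SafeSearch test looked roughly like this (a minimal Python sketch using the Cloud Vision client library; capturing the build screenshot and getting it as raw bytes happens elsewhere):

```python
# Minimal sketch of the SafeSearch test: score a screenshot of a build
# with the Google Cloud Vision API. Requires GOOGLE_APPLICATION_CREDENTIALS
# to point at a service-account key.
from google.cloud import vision

def score_build_image(image_bytes: bytes) -> dict:
    """Return SafeSearch likelihoods for a rendered build screenshot."""
    client = vision.ImageAnnotatorClient()
    response = client.safe_search_detection(
        image=vision.Image(content=image_bytes)
    )
    annotation = response.safe_search_annotation
    return {
        "adult": annotation.adult.name,  # "VERY_UNLIKELY" .. "VERY_LIKELY"
        "racy": annotation.racy.name,
        "violence": annotation.violence.name,
    }

# In my testing, blocky, abstract builds tend to come back UNLIKELY even
# when a human would immediately read them as profane.
```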
Asset approval on Roblox is pretty fast, and I’ve only seen “bypassed” content a few times. I have no idea how you guys do it, but I’d love it if it were exposed to developers somehow. Something that would let me run player-created objects through a filter to see whether they’re profane would be great, if possible.
Is there any update on this feature request? I’m also looking into solutions for moderating in-experience UGC at the moment. (Would be cool if someone could give me advice!) @darkmodeonn
I would like the ability to upload RBXM files as save data to Roblox’s data stores. Not only would this give me easily compressible data for builds, it would also make moderation on Roblox’s end a breeze, since they would only need to validate the RBXM contents instead of some arbitrary JSON or Base64 stream.
It would mean that if I report someone for an “inappropriate build”, moderators could look at the RBXM files saved under their ID and moderate accordingly.
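For what it’s worth, you can already pull a saved build out of a data store externally with the Open Cloud Data Stores API; the problem is that the payload is whatever arbitrary format the game invented, so it’s useless for review. Rough Python sketch, with the universe ID, data store name, and key scheme made up:

```python
# Sketch: fetch a player's saved build entry via the Open Cloud
# Data Stores API. Universe ID, data store name, and key scheme are
# hypothetical; the response is whatever JSON the game chose to save.
import requests

API_KEY = "YOUR_OPEN_CLOUD_KEY"  # needs data store read permission
UNIVERSE_ID = 123456             # hypothetical universe
URL = (f"https://apis.roblox.com/datastores/v1/universes/"
       f"{UNIVERSE_ID}/standard-datastores/datastore/entries/entry")

def fetch_saved_build(user_id: int) -> dict:
    """Download the raw saved-build entry for a given player."""
    resp = requests.get(
        URL,
        headers={"x-api-key": API_KEY},
        params={
            "datastoreName": "PlayerBuilds",  # hypothetical store name
            "entryKey": f"Player_{user_id}",  # hypothetical key scheme
        },
    )
    resp.raise_for_status()
    return resp.json()  # arbitrary game-defined JSON, not an RBXM

# If entries were RBXM instead, a moderator could open the file directly
# in Studio rather than reverse-engineering each game's own format.
```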