3D world filter

preamble

The chat and voice chat are already filtered. But in the world, everything can still be created and built without restriction, which is much harder for us to solve than building our own chat filter system. Why is the world not filtered, where you can not only write but also paint and build? Much worse things can be done there than in chat. Solve that!

As a developer, it is currently very difficult to enforce the Roblox ToS [and your own rules] in open-world / build / paint […] experiences. You have to develop a good technique and train an AI yourself just to get the Roblox ToS [and your game rules] enforced.
Roblox should provide us with a way to filter this content according to certain criteria. For example, we should be able to check the world for:
Political content
Religious content
or, e.g.,
a Nazi swastika [Hakenkreuz]
These things are forbidden in many games, some are also against the ToS, and yet they still occur frequently.
We can then filter the world according to our interests. The AI then returns how sure it is [a number] that a combination of parts [a table with all relevant parts] constitutes a violation [Enum.Violation, e.g. politics, nudity, religion…]. We can then take appropriate action, such as deleting it if the probability is very high [and there are many more possibilities; we can do a lot even with a low probability].
The AI function should only report something to us if it thinks a violation is present, and then tell us how certain it is. 0% matches should of course not be reported, but even 20% would be very interesting to know about.

[If you want to know what even a low probability is useful for, just ask me.]

Of course, it can report several things, and it must take into account how parts relate to each other. The AI could also give keyword comments: e.g., if it flags Enum.Violation.Politics because of a swastika, it gives the keyword “swastika”. It should also list things that are not against the ToS but are often forbidden in games, so that we can freely choose where and how to react. We should also be able to train the AI with our own training data for violations of game-specific rules, so that it can filter such violations for our own game too.
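To make the proposal concrete, here is a rough sketch of what it could look like from a developer's side. Everything in it is hypothetical: `ModerationService`, `ScanAsync`, the category names and the result fields are invented for illustration; no such API exists on Roblox today.

```lua
-- Hypothetical sketch of the proposed world filter, NOT a real Roblox API.
local ModerationService = game:GetService("ModerationService") -- invented service

-- Ask the AI to check a folder of player builds against chosen categories.
local findings = ModerationService:ScanAsync(workspace.PlayerBuilds, {
	"Politics", "Nudity", "Religion", -- hypothetical violation categories
})

for _, finding in ipairs(findings or {}) do
	print(finding.confidence) -- how sure the AI is, in %
	print(finding.keyword)    -- e.g. "swastika"
	print(#finding.parts)     -- all parts that together form the flagged build
end
```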

6 Likes

How would this system even work (from a technological standpoint)? The chat can be filtered because you need to make a request to Roblox’s servers to send the message (which is where it’s filtered). How could a world filter be implemented (especially when games are structured differently) without possibly degrading game performance?

You just need to have proper moderators for your game, or add a system for players to vote-kick offending actors.

1 Like

You can iterate through the workspace, or the script could specify the parts or a folder/model that the AI should moderate. If there is, e.g., a door and a swastika in the model, it would output all parts of the swastika with the political keyword “swastika”. Scanning is done whenever the creator considers it necessary, e.g. every 5 minutes, at special events, etc. That’s why I want Roblox to give us a lot of leeway there and in the evaluation.
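The "scan whenever the creator considers it necessary" idea could look roughly like this. Again, this is purely hypothetical: `ModerationService:ScanAsync` and `handleFinding` are invented placeholders; only `task.spawn`/`task.wait` are real Luau APIs.

```lua
-- Hypothetical periodic scan loop; ScanAsync does not exist in the real API.
local SCAN_INTERVAL = 300 -- seconds; the creator decides how often to scan

task.spawn(function()
	while true do
		local findings = ModerationService:ScanAsync(workspace.PlayerBuilds)
		if findings then
			for _, finding in ipairs(findings) do
				-- e.g. finding.keyword == "swastika",
				-- finding.parts == all parts forming the flagged structure
				handleFinding(finding) -- placeholder for the game's own logic
			end
		end
		task.wait(SCAN_INTERVAL)
	end
end)
```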

This suggestion doesn’t sound reasonably possible.

6 Likes

That is not quite correct; it is quite possible, but it is complicated and takes time. That’s why Roblox should do it, because it’s not so easy for developers, as mentioned above. Roblox has more resources, and it also wants to enforce its ToS and be/remain child-friendly. It must, and this cannot be denied, change something about moderation in the building area.

Well, many games have a lot of parts in a lot of different places. Iterating through every single part in every single ROBLOX game would be outrageously energy-consuming (and therefore money-consuming, since energy costs money).

As well as this, there is no way to properly guarantee that a structure is ‘inappropriate’ through AI alone. This could lead to thousands of developers having their games taken down / moderated over a misunderstanding, which is not a good thing.

Overall, I understand why you think that this should be implemented. Moderation for games is a problem, and it can be very easy to sneak in disallowed content into your games. However, from how complex and difficult the algorithm would be to implement, to the insane cost of this operation, it has far more negatives than it has positives.

2 Likes

Please read my post and my previous answers carefully again. If you have any questions or comments after that, please feel free to ask, but most of what you just said is already addressed in a reply above or in the original post.

I’d suggest you make it yourself. This is already a pretty much impossible request, and games don’t all use the same way of “painting”, so a Roblox AI can’t just “pick up” on it.

6 Likes

But then Roblox bans me if players in my game paint Nazi symbols and I can’t do anything about it? That also sounds unreasonable.

1 Like

Could you please read my post + replies first before you reply?
Developers can call a moderation function [with a rate limit], which is passed a table/folder/model with the items to be checked, plus an environment table/folder/model [the terrain can also be in there]. Everything in the environment argument is not itself filtered, but is taken into account when filtering the items. This function then returns:
nil [if nothing was found]
or a table, e.g.:
result = { [findID] = {
how sure the AI is [in %], a table with the relevant items, Enum.Violation [the ToS violation, e.g. “nudity”] }
}
The developer/script can then act at its own discretion, e.g. delete the blocks or send the block combo to a mod.
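In code, handling the nil-or-table return described above might look something like this. The function name `ModerateAsync`, the result layout and `sendToModerator` are all hypothetical placeholders for the proposal, not real Roblox APIs.

```lua
-- Hypothetical: ModerateAsync gets the items to check plus an "environment"
-- (context that is considered but not itself filtered, e.g. terrain).
local result = ModerationService:ModerateAsync(itemsFolder, environmentFolder)

if result == nil then
	return -- nothing suspicious was found
end

for findId, finding in pairs(result) do
	if finding.confidence >= 90 then
		-- very confident: delete the offending parts immediately
		for _, part in ipairs(finding.parts) do
			part:Destroy()
		end
	elseif finding.confidence >= 20 then
		-- even a low confidence is useful: queue the combo for human review
		sendToModerator(findId, finding) -- placeholder for the game's mod flow
	end
end
```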

It is quite possible. Roblox also has the resources in terms of time and computing power, and it is the one responsible for ensuring that the ToS is adhered to. Why should I spend months working on it and years tweaking it just to comply with the ToS when other games don’t bother?
Painting was only meant in the context of blocks; some games use blocks for painting.
Nobody would be forced to use this function.

Given how good Roblox moderation is in its current state, this feature would result in the total disruption of EVERY game that implements it.
You can’t just say that the AI will check objects 1, 2 and 3 and then give something as the outcome.
After reading all that mass of text, I think you want an easier way to moderate things like player builds, paintings and such.
But AI can’t help there:

  1. The info the AI gives will be too inaccurate and useless - “Part1 violates Politics with 96% confidence, because together with Part2 and Part3, which are near Part5, which is behind Part4, they form an inappropriate thing from some specific angle” - but in reality that’s just a regular house. You can never be sure about AI output. The mistakes of an AI are UNPREDICTABLE!
    As a result, no thinking developer will use it at all.

  2. How would this AI work internally? Deeper than “it will check predetermined parts” - a complete review of its function. How would it assess unnatural things like red alien trees or weird rock formations based on raw part data?
    And now remember that Roblox takes forever to do something well. A suggestion like this is GUARANTEED to be swept under the rug and won’t go further than a suggestion at all.

Things like this should be done by developers themselves, not by Roblox. Just because you don’t want to go through all this hassle yourself doesn’t mean that the Roblox team will want to do it.

And if you are going to reply with “I have already told you why this is possible and what is needed”, then give a straightforward link/quote to that message.

3 Likes

That’s the reason why it should return the parts, so mods can look over it.

Roblox could use training data from the platform, but generally it would be trained on bad examples - so not alien trees, but e.g. nude buildings.

It’s literally their job; they are a team, and it’s their ToS and their platform that they want to keep safe.

Then this has absolutely no purpose, because it would be much easier to make a “Report player” button in-game and send the suspicious building to the game’s moderation team after N player reports.
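The report-based alternative is straightforward to build with what exists today. A minimal sketch, assuming the game assigns each build an ID; the threshold value and the `notifyModerationTeam` helper are placeholders for the game's own escalation path:

```lua
-- Minimal in-game report counter: after N distinct reporters flag the same
-- build, it is escalated to the game's moderation team for human review.
local REPORT_THRESHOLD = 3
local reports = {} -- [buildId] = { [reporterUserId] = true }

local function reportBuild(buildId, reporterUserId)
	reports[buildId] = reports[buildId] or {}
	reports[buildId][reporterUserId] = true -- same player can't report twice

	local count = 0
	for _ in pairs(reports[buildId]) do
		count += 1
	end
	if count >= REPORT_THRESHOLD then
		notifyModerationTeam(buildId) -- placeholder: mod GUI, webhook, queue...
	end
end
```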

Even if the training data is good (in terms of usability), that does not mean the AI will give correct results.
Assume that “bee” is something Roblox wants to detect. A player builds a Doge with a black-and-yellow striped costume, and the AI has a great chance of saying it’s a “bee”, while it actually isn’t.

If that were the case, then we wouldn’t experience outages almost every Saturday.

Yes, it’s their job. But NOT ONLY theirs - it’s the DEVELOPERS’ job too!

I mean in-game mods, because Roblox’s moderation is just … Roblox’s moderation.

That’s why it should be trained well. And it’s not that bad if there are a few wrong detections.
That’s why I mean in-game mods; they can look over it again and decide what to do.

I don’t experience outages almost every Saturday. Also, they could buy a new server for that.

Name one good reason why it’s our job to do that and work for months on this just to keep our game safe for all ages.

It is unjust, I agree, but it is your responsibility to make sure your players do not abuse the systems that you put into your games. Besides, even if you don’t see it as your issue to solve, and would prefer ROBLOX to do the moderating, how can ROBLOX decide who is responsible for a player creating disallowed graffiti in your game?

Unless you leave a digital footprint of every interaction the player makes with the content creation system (whether it’s in-game painting or something else), it is impossible to track who makes what symbol.

Assuming you do decide to add a digital footprint into the system, ROBLOX would then have to iterate through every instance of these very quickly changing parts. If you have a paint tool, players will constantly be deleting and moving their drawings, which makes keeping a record of what everyone has done inside the game extremely costly in terms of energy (since storing the data related to parts, such as size and position, will require many more servers, which also cost money), and also in terms of training data for the AI to learn from.
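For what it's worth, a basic per-part digital footprint is cheap to add on the developer's side with attributes. `SetAttribute`/`GetAttribute` are real Instance APIs; `onPartPlaced` is a placeholder for wherever the game's own build tool actually creates parts:

```lua
-- Tag every part a player places with their UserId, so later moderation
-- (human or automated) can tell who built what.
local function onPartPlaced(part, player)
	part:SetAttribute("PlacedBy", player.UserId)
end

-- Later, when a structure is flagged, look up the builder:
local function getBuilder(part)
	return part:GetAttribute("PlacedBy") -- UserId, or nil if untracked
end
```

This only answers "who placed this part", not the harder (and costly) problem of tracking every edit over time that the post above describes.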

In my opinion, this entire topic can be summed up fairly easily: your system, your responsibility. You cannot expect father ROBLOX to moderate systems that you implement into your own game, as if it were a simple and easy task.

It may seem quite unfair and even a bit outdated that ROBLOX does not have a system to simply scan parts that are present within a game and check for any disallowed content. But that is for a reason: its cons outweigh its pros.

1 Like

Looks like you still have to do that.

Why should Roblox decide that?

The function would return the parts it thinks are a violation. You can, e.g., group the parts a player made in a special folder…

Did I say it’s easy? It looks like you still did not read any of my messages, including the original post. As a reminder: your platform, your rules, your safety systems (systems that help your rules get enforced).
E.g. the (voice) chat filter, the content upload filter, etc…

The problem with your request is that it requires too many technical hurdles to even get off the ground and be adequate enough for anyone to use in their games, not to mention the server costs (inb4 you quote your second reply). This would be technology that other game studios and companies would approach Roblox to gain access to, because not many companies (if any at all) have it.

You’re asking for a system that would likely be out of Roblox’s technological field. The only way I could see this working is by Roblox grabbing every folder/model/BasePart in the game and rendering it into an image, and moderating based on that. But as we see already (they do this for models), that doesn’t always work properly.

Yes, but like I told someone else, you are using their platform, their technology, and their resources. I would assume that you’d have the tools to moderate your game if you’re allowing people to make their own creations in it; you cannot offload your work to Roblox and expect them to do everything.

One example I can think of was a few years back, when the creator of a game added a new animation for players to emote with, and players started doing inappropriate things with it. Instead of asking Roblox to fix the issue, they fixed it themselves by removing the animation. They understood the responsibility they held as the creator of the game and took the appropriate action.

Another example are script executor games. I personally don’t like them, but if they’re properly moderating the content in their games, then it’s fine. Some executor games heavily restrict what players can access with their code and/or deny players the ability to execute scripts entirely. They could ask Roblox to develop a sophisticated system to detect if a script is being sent to the server, and change the contents of it, but they don’t.

Roblox only moderates content that goes through their systems on the website, because that is the only way inappropriate assets could be used anywhere on the site. For example: the chat is sent to and from their servers, so I would expect them to properly moderate that; image and model uploads are sent to and from their servers, so I would expect those to be moderated, etc. Anything outside of what the developer can deal with is dealt with by Roblox. However, when it comes to game content, a lot of that moderation work is up to you instead, since everything is now being done via your tools and systems.


I understand that you don’t have the time or resources for moderating your game, but that’s where you should have something in place where the people in your game can help you with that. Some games have a vote kick system, some have a mod call, and others have a built-in reporting feature, just as a few examples.

3 Likes

Roblox moderation uses AI to moderate things such as audios and images. This has terrible results, with some of the most unreasonable bans being dished out by this system. Ex. a dev terminated for uploading a photo of cookies with a website watermark. Imagine the incredible accuracy loss by adding a third dimension. A feature like this would be incredibly hard for both Roblox engineers to implement, and game developers.

1 Like