Clarification on code flagged for safety review

It’s understandable that you need to keep people safe and prevent bad actors from writing code that could harm others. However:

This is very concerning to many developers, myself included. I honestly do not see the need to moderate code that is not visible to anyone but the game developer.

Moderating code in free models, on the other hand, is understandable; however, closed-source code can contain private and sensitive information. Although the people reviewing the code are paid professionals, I still don’t feel comfortable having others look at that sensitive information.

Aside from that, honestly… do it for the reviewers if anything; save them from the horrible, eye-burning habits people have…

5 Likes

That’s what I may have to resort to as well (as much as I’d greatly dislike doing that, knowing how many bugs I accidentally let slip past), but…

…what if they just will outright give warnings/bans/terminations for “unreadable code” just like they give warnings for “unreadable text”?

That’s what I’m fearing, like it’ll inevitably happen (and I’d be really [un]surprised, bemused even, should that happen)

11 Likes

This is a very unnecessary system and I don’t like it.

Many developers, myself included, used free models in our old games when we were younger. The chance of those models containing inappropriate words is very high, and it would make no sense for those to be the reason we get banned.

No player is able to see our scripts; only exploiters can, and only the LocalScripts at that. There is therefore absolutely no need to moderate what we write inside our ServerScripts, where no other soul will see it. And if a player does see the code, you might want to ban them for exploiting, not the developer.

The only place I could see this system coming in handy is moderating free models and uncopylocked games.

I remember you stating that “All text seen by players or that can be modified by players must be filtered” (or something along those lines), so scripts do NOT need to be filtered; the player has no access to them, only the developer does. I don’t want this system to give me a false positive and have the game disabled, or even worse, the account deleted, just because it thought a word was inappropriate and the moderator didn’t even read the script before deciding to ban/disable it.

This is a useless system that should not have been added, and I will not change my mind, so I might as well start obfuscating code like Berezaa said.

13 Likes

While this entire system may be found concerning, the fact that Roblox has not addressed these concerns is even more concerning. When big topics arise, Roblox administrators are nowhere to be found.

8 Likes

@berezaa brought up the most concerning aspect for me: access keys for APIs & their endpoints. I don’t care if mods see my code, but I do NOT want my supposedly secure keys snooped on by anyone. That is a huge security risk. This is just inappropriate.

I demand that a new service be added to store sensitive information like that, similar to AWS’s Secrets Manager.
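In the meantime, about the only workaround is to keep keys out of the place file entirely and fetch them at runtime from a server you control. A rough Lua sketch (this is an assumption on my part, not an official pattern; the URL is a placeholder):

```lua
-- Hypothetical sketch: fetch an API key at runtime instead of hard-coding it,
-- so the key never appears in the place file that moderators would read.
-- "https://example.com/secret" stands in for an endpoint on a server you control.
local HttpService = game:GetService("HttpService")

local cachedKey

local function getApiKey()
	if not cachedKey then
		-- The endpoint should itself be protected (e.g. by an allow-list),
		-- since anyone who can call it can read the key.
		cachedKey = HttpService:GetAsync("https://example.com/secret")
	end
	return cachedKey
end
```

This only moves the problem, of course: the key still exists somewhere, but at least it is no longer sitting in a script a reviewer can read.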


If I were a professional game developer exploring Roblox as an option, the fact that my code could be manually audited at any time without my permission would immediately steer me away. Roblox needs to start treating this platform professionally.

138 Likes

👎 This is very, VERY dangerous and stupid. I write my code the way I write it, and I shouldn't have to let other people view my code because an imperfect filter system marked a false positive

There is virtually no way for “specially trained” moderators to tell if personal information is real or if it’s the fake identity of a character, and absolutely no way for an automated system to detect it.

I should not have to suffer unjust moderation because of an imperfect system.

Deal with inappropriate scripting on a case by case basis. If someone makes their game do something horrible, moderate them. Don’t make everyone else suffer instead.

Also, how will I make a custom filter for chat (in addition to Roblox’s filter)?
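For what it’s worth, a custom filter layered on top of Roblox’s filter usually looks something like this minimal Lua sketch (the blacklist words and function name are made up; real player-facing text would still have to go through TextService:FilterStringAsync afterwards):

```lua
-- Sketch of a custom blacklist check layered on top of Roblox's own filter.
-- The words below are placeholders, not a real blacklist.
local CUSTOM_BLACKLIST = { "examplebadword", "anotherbadword" }

local function passesCustomFilter(message)
	local lowered = string.lower(message)
	for _, word in ipairs(CUSTOM_BLACKLIST) do
		-- %f[%a] / %f[%A] are frontier patterns: match the word only at
		-- word boundaries, to avoid Scunthorpe-style false positives.
		if string.find(lowered, "%f[%a]" .. word .. "%f[%A]") then
			return false
		end
	end
	return true
end

print(passesCustomFilter("hello there"))         --> true
print(passesCustomFilter("examplebadword here")) --> false
```

If scripts themselves get filtered, even a harmless table like CUSTOM_BLACKLIST above could be flagged, which is exactly the worry.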

Again, this is absolutely unnecessary and will lead to more robust and unneeded obfuscation from developers. This is the kind of change that needs to be totally reverted, no questions asked.

At bare minimum, this should ONLY apply to open source code. Otherwise I might as well quit Roblox.

61 Likes

This system can never serve its intended purpose & will only affect legitimate developers

This by itself is pretty much crazy from a technical standpoint. Analyzing code is already extremely difficult as is; any sort of automated system is bound to fail against someone trying even slightly to hide their intentions.

Let’s pretend that the word ‘interesting’ is discriminatory language. We can show how to effectively conceal this string in less than 5 seconds:

```lua
print("interesting") -- this would obviously be detected by such a system, but let's put a tiny bit of obfuscation over it

local Key53 = 8186484168865098
local Key14 = 4887

local function decipher(str)
  local K, F = Key53, 16384 + Key14
  return (str:gsub('%x%x',
    function(c)
      local L = K % 274877906944
      local H = (K - L) / 274877906944
      local M = H % 128
      c = tonumber(c, 16)
      local m = (c + (H - M) / 128) * (2*M + 1) % 256
      K = L * F + H + c + m
      return string.char(m)
    end
  ))
end

print(decipher("4578c511b9d502830d468f")) -- now our string is magically gone!
```

Unless Roblox does dynamic analysis on this code, there is effectively no way of telling this script has malicious intent. And even if they did do dynamic analysis, there are 1,000 different anti-sandboxing techniques that can be used to defeat such a system as well.

There is also the case of people just using one of the many full script obfuscation solutions that can conceal the entire contents of the script from analysis - these have been described earlier on this forum, and are already widely used by the bad actors/exploiters Roblox is trying to protect against in the first place.

Let’s recap:

  1. This system will have ZERO effect on any actual bad actor who wants to hide their intentions. This is an arms race you can never win, one that malware & anti-malware software have been playing for over 20 years.
  2. Legitimate developers who want to add custom chat filters or similar can (and will) be affected by this - see the Scunthorpe problem for more information about this.
  3. There is also the legal & IP side of this - random moderators reading scripts with possible trade secrets or API keys is not a recipe for success.
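Point 2 can be shown in a few lines of Lua, reusing the thread’s pretend blacklist word from earlier (a toy substring check, not Roblox’s actual filter):

```lua
-- The Scunthorpe problem in miniature: suppose "interesting" is flagged.
-- A naive substring check then flags the perfectly innocent word
-- "uninteresting" too, because the blacklisted word appears inside it.
local function naiveFilter(text)
	return string.find(string.lower(text), "interesting", 1, true) ~= nil
end

print(naiveFilter("that film was uninteresting")) --> true (false positive)
print(naiveFilter("hello world"))                 --> false
```

Word-boundary matching avoids this particular failure, but only at the cost of missing deliberately spaced-out or encoded words, which is exactly the arms race described in point 1.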

In short, this system is deeply flawed and can never succeed. Individual moderation is still the best option in these cases; at least it can take context into account, unlike an automated system.

60 Likes

Agreed. What if I write a random string generator and one of the strings it generates happens to contain a profane word? This has happened before with an ID system I made.
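One common mitigation for this (an illustrative sketch, not anything Roblox recommends) is to generate IDs from an alphabet that cannot spell English words at all:

```lua
-- Generate random IDs from a vowel-free alphabet, so no English word,
-- profane or otherwise, can ever appear in the output.
-- (0 and 1 are also dropped to avoid look-alikes with O and I.)
local SAFE_ALPHABET = "BCDFGHJKLMNPQRSTVWXZ23456789"

local function randomId(length)
	local chars = {}
	for i = 1, length do
		local index = math.random(1, #SAFE_ALPHABET)
		chars[i] = string.sub(SAFE_ALPHABET, index, index)
	end
	return table.concat(chars)
end

print(randomId(8)) -- e.g. "KT7MPX2B"; never contains a vowel
```

But a developer should not have to design around the moderation system like this in the first place.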

Never before has there been such a unanimous level of condemnation from the developer base. Without developers, Roblox would not exist.

I’m seeing better treatment of YouTuber star creators than of developers. We need staff to step in and make changes + discuss things with us.

26 Likes

Who are these people specifically checking code?

If a bad script is found, what’s the next step in moderation?

Are there going to be improvements to a key topic, moderation? Things like chat filters, the report system, etc.

3 Likes

Developers: Exist
Roblox: Wait, that’s illegal!

In all seriousness, this is just a waste of manpower that could be better spent improving the current moderation/filtering systems.

26 Likes

This is not true. There can be malicious code that affects either other developers or players.

I just had a feeling to clarify that.

Specifically, I meant code developers wrote themselves in a way that won’t hurt anybody but themselves. For example, like others said, what if developers wrote in their comments curse words, just as a way to vent their frustration dealing with a bug?

3 Likes

Is there a list of all words that aren’t allowed, like a blacklist?
I’ve probably used some of them, but it feels crazy to check every script in even my old games for whether I ever used a bad word. (Not all English bad words are considered bad in my country.)

I feel like local variables and comments shouldn’t count toward this, as only you or the developers can see them. Other people, like exploiters, aren’t even supposed to see them.

5 Likes

I 100% do not agree with this. Unlike other developers, I’m not concerned as much about my code being opened up. However, this is a waste of manpower that could be put to better use somewhere else, like plugin and free-model checking.

Keep in mind that just because I say “I’m not concerned as much about my code being opened up,” does not mean I’m not concerned at all, and I’m definitely worried about other developers.

I will cover a few issues with this:


Ok, this is a bit complicated. Some developers obfuscate their code in order to hide references to key instances or to protect precious code. However, obfuscated code may look suspicious when a moderator views flagged language out of context. In this scenario, I’m afraid of the “better safe than sorry” point of view.


Sounds good at first, but can easily mess things up just as much. Of course, you can’t have every script ever made moderated (that’s completely unrealistic), so a flagging bot in place is a good idea. The issue I have is people abusing the features of a flagged script in order to overwhelm Roblox with moderation, making it more difficult to review code.


I somewhat agree with this, but again, there are issues. What if there is content targeted towards older audiences (language, blood scripts, etc.) that’s shown clearly in a script? Will this be taken into account, or will it be moderated like everything else?

Code should be taken in different context and understandings. Of course, the Terms and Rules should still be followed, but older-audience games shouldn’t suffer for going borderline.


I do respect that Roblox is attempting to take extra steps to protect players and developers alike, but it could be done so differently.

6 Likes

Yea, sometimes developers like me will try to be funny to ourselves and name variables curse words; it’s easy to remember and more entertaining to write :stuck_out_tongue:

1 Like

I don’t really see the purpose of this. Unless the code is open source, there is no harm in the code having inappropriate language. This policy seems to miss the point of the community rules in the first place, which is to protect the community.

Nobody is being protected here.

11 Likes

100% on this. If a publicly available model has scripts containing vulgar language, I think it is fair game to moderate it on some level. But, private code that only you and your team can view? That’s just stupid.

9 Likes

If a script was found to contain content violating any rules, would the game/place, the script’s creator, or the game owner be punished? I feel that some people have scripts with rule-violating content but are unaware of it, because they either used a free script or a developer added it. Another issue: if script creators are punished for offensive content, but the offensive words were actually contributed by another member through Team Create, would a system be in place to acknowledge that?

1 Like

I often put offsite website links in my code as references, through comments. These are sources I used while writing the code, and they are there for future reference only. I hope this doesn’t get my code flagged, since offsite links normally aren’t among the exceptions permitted by Roblox’s Community Rules.

6 Likes

It sounds like Roblox wants to check your comments too, which is unimaginably stupid for closed source scripts. Even for open source scripts, this whole idea of moderating the text in scripts is questionable at best. Who is Roblox seeking to protect with this? This helps almost nobody and potentially harms many.

4 Likes