Response to code safety review discussion

Exactly. At minimum, we should know who is viewing our code when it’s being looked at.
How were these people trained? How long were they trained? Who are they?

On another point, I want to address that Roblox still does not explain why our code needs to be checked. It makes no sense that closed-source code would be “threatening”.

They continue to be vague, almost as if they don’t really know the answers to our questions. I feel like this is pseudo-security, at best.

I warn Roblox that they can potentially get sued for creating a false sense of security. This is no joke, and it happens all the time in the real world. Roblox must show that what they are doing is legitimate and explain how it keeps everybody safer. You cannot simply say something makes the platform safer and therefore it is.

Additionally, being vague will make many developers upset (as seen here). It looks like there is no reasoning behind this, because Roblox staff have given no responses. This is a horrible business decision and should be re-thought.

We still want to know what code you are checking, when you are checking it, how it helps, and who specifically is checking it.

Ignoring us will do you no good. Listen to the developers, not yourselves.

16 Likes

As others have said, many concerns were neglected here, including:

  • What protects our private assets from being seen and (worst case) stolen if for some reason one of us crops up on the radar?

  • You mention Team Create is not considered shared and that the majority of the scrutiny is on shared content. What about group games, then?

  • How is this system trained? What, specifically, does it look for? Are false positives or false negatives punished? How is the team trained?

There still are a few ambiguities, but this does help clear up some of the general intents behind this policy.

This does feel like security theater, you’re right.

11 Likes

So if I understand correctly, the process is:

  1. Automated code-inspection bot flags games for review when specific patterns are detected.
  2. Specialists enter the game as normal players to see whether it behaves unsafely.
  3. If the game behaves in an unsafe manner, then the underlying code is inspected.

I’d say this is an automated ‘Report Game’ system.

That’s comforting. Thanks.

I do have one question regarding this: Does this mean that private games are not inspected at all? I don’t see how the specialists can join games that only the creator can access, unless they have special access permissions.


Could you clarify this some? Numbers maybe?

There are billions upon billions of games on the website, so how is this small moderation team able to keep up?

Let’s say there are 1 trillion games, and then let’s say the percentage of those games flagged by the bot is 0.0001%. That would still come out to a whopping 1 million games.

I know these numbers are not at all accurate, but the point is, the game catalog is HUGE, so even if flagging is rare, it’s probably still a lot.
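
Just to make the back-of-the-envelope math explicit, here is a tiny Lua sketch; both figures are the same made-up assumptions as above, not real Roblox numbers.

```lua
-- Back-of-the-envelope sketch only; both figures are made-up assumptions, not real data.
local totalGames = 1e12       -- assumed catalog size ("1 trillion games")
local flagRate = 0.0001 / 100 -- assumed flag rate of 0.0001%

print(("~%.0f games flagged for human review"):format(totalGames * flagRate))
--> ~1000000 games flagged for human review
```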

7 Likes

Oh nice!

Thank god Roblox explained everything about this new code review system; I thought we wouldn’t get a response. This also covers everything that I requested in the last thread. Well, now I feel better about my privacy :+1:

1 Like

I remember a while back an announcement was posted about the Roblox rules requiring you to disclose the correct statistics for case/crate/random-drop openings in your game if they are purchased with the Robux currency. This can be found here: Guidelines around users paying for random virtual items

Since that update, I have assumed they read our code to verify the statistics whenever they believe we are lying to players who spent Robux. If they suspected it, they could just read the code to verify the actual probability of a crate giving you something, and if it was off, they would apply the appropriate moderation or ask you to fix it.

I believe that this use case of code moderation is fine, as long as we can be sure keys are encrypted and safe in KeyService, and that the moderators won’t release our methods of effects in games or our personal IP. They are just making sure the player doesn’t get scammed by some bad actor claiming that legendary items drop 99% of the time, leaving some kid feeling bad because, while he believes he is just ridiculously unlucky after spending lots of Robux, the developer actually set the probability to something like 1%.
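
To illustrate what a reviewer would actually be checking in a case like this, here is a minimal Luau sketch of a weighted crate drop table; the items and weights are hypothetical, and the point is simply that the disclosed odds have to match what the code really does.

```lua
-- Hypothetical drop table; these weights are exactly what should be disclosed to players.
local dropTable = {
    { name = "Common Sword", weight = 79 },
    { name = "Rare Shield", weight = 20 },
    { name = "Legendary Crown", weight = 1 }, -- disclosed as 1%, so the code must actually give 1%
}

-- Sum the weights (here they total 100, so each weight is also a percentage).
local totalWeight = 0
for _, item in ipairs(dropTable) do
    totalWeight = totalWeight + item.weight
end

-- Open a crate: each item is chosen with probability weight / totalWeight.
local function openCrate()
    local roll = math.random() * totalWeight
    for _, item in ipairs(dropTable) do
        roll = roll - item.weight
        if roll <= 0 then
            return item.name
        end
    end
    return dropTable[#dropTable].name -- floating-point edge case fallback
end

print(openCrate())
```

Comparing a table like this against the odds shown in the shop UI is enough to catch a “legendary 99% of the time” claim sitting on top of a 1% weight.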

This is just one valid example that I would agree with Roblox moderating and deciding as malicious.

Another example would be free models with viruses in them. Search for anything in the toolbox: if it isn’t Roblox-verified, chances are it has a virus script in it that replicates to every other object in the game and is tedious to remove. I remember my first place was plagued with these because it was built entirely from free models, many of which were made by regular users with malicious intent.
I agree with Roblox moderating this even more, 100%! If it’s malicious and on the toolbox for free, any user can already view the code. In that case, having a support team review your public code is even safer than a random user reviewing it, because they are employed by Roblox.
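
For a sense of what hunting these free-model viruses can look like, here is a rough Luau sketch meant for the Studio command bar (script source is only readable from plugins or the command bar); the “red flag” patterns are just common-sense guesses on my part, not Roblox’s actual detection rules.

```lua
-- Rough "free model audit" sketch, intended for the Studio command bar
-- (Script.Source is only readable from plugins/the command bar, not at runtime).
-- The patterns below are common-sense guesses, not Roblox's actual detection rules.
local suspiciousPatterns = {
    "getfenv",         -- environment tampering is rare in legitimate model code
    "require%(%d+%)",  -- require-by-asset-id is a classic way to hide a payload
    "\\%d%d%d",        -- long escaped-byte strings often indicate obfuscation
}

for _, inst in ipairs(game:GetDescendants()) do
    if inst:IsA("LuaSourceContainer") then
        local ok, source = pcall(function()
            return inst.Source
        end)
        if ok then
            for _, pattern in ipairs(suspiciousPatterns) do
                if source:find(pattern) then
                    warn(("Suspicious script: %s (matched %s)"):format(inst:GetFullName(), pattern))
                    break
                end
            end
        end
    end
end
```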

I think these circumstances and scenarios would answer your question of what code could be flagged. Hopefully this helped. Thanks!

5 Likes

Roblox isn’t checking every script for viruses. They can’t do that. It would require human interaction, and they aren’t wasting their time doing that. It’s pretty difficult to determine what a “virus” even is on Roblox.

4 Likes

Yes, but they have some malicious-code detection system that can, if triggered, send a request to a human when it thinks something is sketchy in public free-model code.

2 Likes

How is this a “waste of moderation resources” when this is an automated feature? If anything, this makes moderation easier.

Also, Roblox does not work on one feature at a time. They did not use all their resources to make this feature. It was probably made by a small team within their company.

You seem to just have some sort of hatred for Roblox and are nitpicking things because of it.

2 Likes

If the code is closed-source, that information won’t get out there. Besides, if someone wanted to use that kind of information against another person, such as for doxxing, putting it in a closed-source script would not make sense. So really, I don’t believe that kind of information ends up in closed-source scripts.

EDIT: Reading what I quoted again, it seems more like an excuse rather than a valid concern by Roblox.

10 Likes

That’s a really good point. I hadn’t thought about the reasoning behind all of this. Yes, developers may have profanity in their code, but why would that require manual code review? This is why games can be reported: users can report the profanity or explicit content themselves.

Another question I have is what would stop a game from changing some variable names and other parts into family-friendly names that go undetected by the filter. For example, condo games are very inappropriate and are on Roblox for around 10-30 minutes before they’re reported and taken down (great job to the moderation team on this one). But what would stop them from, say, changing a variable named Penis, which would usually trip the filter and alert the “specially trained team”, to something like Dog? This doesn’t make sense to me, as the only code that would end up reviewed would be code that is 99.99% clean and was a false positive.
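
To spell out why a keyword-only filter is so easy to sidestep, here is a tiny self-contained Lua sketch; the banned-word list and both sample lines are made up, but they show how renaming one identifier hides identical behavior from this kind of scan.

```lua
-- Toy keyword scanner: flags source text containing any banned word.
-- The word list and both sample lines are made up for illustration only.
local bannedWords = { "penis", "nsfw" }

local function isFlagged(source)
    local lowered = source:lower()
    for _, word in ipairs(bannedWords) do
        if lowered:find(word, 1, true) then -- plain-text find, no patterns
            return true
        end
    end
    return false
end

local before = 'local PenisPart = workspace.Model.Part'
local after = 'local DogPart = workspace.Model.Part' -- same behavior, renamed identifier

print(isFlagged(before)) --> true: caught by the keyword scan
print(isFlagged(after))  --> false: identical behavior slips straight through
```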

2 Likes

Clears up a lot.

But still, my code is mine; nobody except me should ever read it, period. At the very least, notify me when your trained team reads my code.

3 Likes

I personally think this addressed a lot of my own concerns and has helped clarify some of the issues I brought up; however, I believe there is still a lot of room for improvement.

Here’s a quick summary of what I’ve seen being areas that still need more info:

  • Who makes up the specially trained team (in more detail)? Specifically, how large is it: 5, 10, 15 members? Fewer? More? What is their intended procedure for checking games?
  • What explicit powers do they have, how is abuse of these powers handled? What are they allowed to do with our code? How much of our code may they access?
  • What guidelines and rules must they follow? How are these rules bound to them?
  • What potential risks does Roblox themselves believe this feature may pose for developers, or themselves?
  • How does Roblox plan to keep developer code safe from malicious hands?
  • How does Roblox handle obfuscation? Do they run code through any VM for analysis? Do they apply any analysis to strings? Are reviewers required to do an in-depth search of obfuscated code?
  • How is code analyzed in off-sale assets vs. on-sale assets? How is code analyzed in secondary places of a game?
  • What are some detailed examples of something that could (and should) potentially be flagged? What kinds of false positives could exist in the flagging process, assuming flagging is deterministic?
  • How will developers dispute decisions by the code review team, for example falsely moderated assets?
  • Will games be able to be permanently locked down from being played, similarly to how some completely innocent games from 2009-2012 sit in review permanently? How will developers handle this?

Hopefully this serves as a good post to summarize my own questions/concerns as well as others’ questions and concerns.

15 Likes

I don’t see how any scripts in places can be “dangerous or harmful”, as the affected player could just leave and never play that game again.

2 Likes

I do not have a hatred for ROBLOX. I just find it invasive and disgusting that moderators have the ability to review my code. You are wrong; this isn’t just an automated feature. Code is flagged automatically, then reviewed by a “specially-trained team.”

If someone reports the game, then review the code. Otherwise, stay out of my intellectual property.

8 Likes

I want to add that many people are saying, “it’s their platform, their rules.” You’re right, but also wrong. Developers make the platform, and Roblox keeps kicking us around.

There are multiple other platforms. You just hit 4 million concurrent players (congrats, by the way). Do you really want to lower that? This update doesn’t affect regular players, but it does affect developers. If the developers leave, there’s nothing for players to play, and players will stop playing ROBLOX.

7 Likes

Three points on what’s still wrong with this update:

  1. Roblox says they are looking for dangerous or harmful scripts, but did not explain what counts as dangerous or harmful.
  2. This wastes resources while intruding on the privacy of creators’ code; we want privacy for our own code.
  3. This matters regardless of Roblox’s terms: copyright laws exist in countries such as Canada (which has strict copyright law), etc., and under strict laws like these it could hold up in court.

3 Likes

I am also wondering this. Are private games also being checked?

The original post merely states that the system looks for malicious behavior in code. In my mind, this could include known exploits.

I have a private anti-exploit development place where I test known exploits against my anti-exploit system. I have been saving both the exploit code and the anti-exploit code in the development place (along with other documentation) for ease of reference. Only the anti-exploit will be added to the public game.

Is this fine? Or should I be saving the exploit code separately (off of Roblox) in order to avoid triggering the automated review system?

3 Likes

Just found it weird that you decided to bring up how you think the Roblox report system is broken when this new system has nothing to do with that and doesn’t affect it in any way. Gave me “omg I hate roblox mods the report system sux!!!1” kind of vibes.

This new system would not reduce Roblox’s moderation capacity, as this new review team would be completely separate from normal mods (they would be trained engineers).

Also, I think you are being a bit too dramatic. Should Roblox be able to look through all your code? Maybe not. But you are disgusted that they can review code that you have uploaded to their site? You are reaching a bit there, pal.

I’m not 100% in support for this feature. I don’t know where I really stand, but I do think people are overreacting too much.

I do like the idea of introducing a way to securely store API keys and sensitive data in a manner that isn’t in plain text in the script.
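
As one hedged example of what that could look like today, here is a minimal Luau sketch that keeps the key on a server you control and fetches it over HTTPS at runtime instead of hard-coding it; the endpoint URL is a hypothetical placeholder, and this only reduces exposure rather than replacing a proper built-in secrets store.

```lua
-- Minimal sketch (assumptions noted below): fetch an API key from a server you
-- control at runtime instead of hard-coding it in the script.
-- Requires HTTP requests to be enabled for the game.
local HttpService = game:GetService("HttpService")

-- Hypothetical placeholder endpoint; you would host and secure this yourself.
local KEY_ENDPOINT = "https://example.com/my-game/api-key"

local cachedKey

local function getApiKey()
    if not cachedKey then
        -- pcall so a network failure doesn't crash the server script
        local ok, result = pcall(function()
            return HttpService:GetAsync(KEY_ENDPOINT)
        end)
        if ok then
            cachedKey = result
        else
            warn("Failed to fetch API key: " .. tostring(result))
        end
    end
    return cachedKey
end

-- Usage: pass getApiKey() in request headers; never store the key in the place file itself.
```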

I think Roblox has every right to review code that is uploaded, hosted, and distributed through their own platform and database. But hear me out: I don’t think they should look through everyone’s code except under certain circumstances (such as a system detecting potentially malicious code that could harm their users), in which case a trusted team (hopefully under NDA and close supervision) can review the code!

If you are so scared about your intellectual property being reviewed (which, as Roblox has said, would rarely happen), then go to a different platform where you are the one in control.

9 Likes

Please read the post. Roblox are only concerned in cases where games are showing harmful content to players; code will only be examined when these specialists need additional context.

Of course Roblox have access to your Roblox account, it’s their website.

This is answered both in the OP and your own post. The code/game is flagged [automatically], and it is reviewed by a specialist team manually.

Let’s show kids adult and extreme content. If they don’t like it; they can just leave! /s

Please read the post. Roblox are only interested in harmful content, such as adult content; extreme content; or NSFW/NSFL content being shown to players.


PSA: Read the post and other replies before posting. Your question has probably been already asked/answered or is in the OP.

8 Likes

To summarize, this system is broken when it comes to handling Team Create. “The user is guilty until proven innocent” is how I think of this moderation. It’s sad to see they still haven’t implemented some sort of tagging on object creations/insertions so that whoever actually inserts the content gets moderated. This will be abused everywhere unless they keep some sort of history of who created what. Banning the game owner because people in his Team Create inserted inappropriate assets without the owner’s knowledge is beyond unprofessional.

8 Likes