Response to code safety review discussion

Hey, why not just make a system where you as the owner of the game get messaged saying:

Hey developer,
Your game scripts have been flagged for malicious behavior. Please review and make changes to this area: Song Script > Lines 5 to 10: local I like big buns and I cannot lie. If no edits are made, the game will be locked until they are… If you believe this to be a false flag, please press the review button to have this snippet of code reviewed by that specially-trained team.

If your system flags specific code, it shouldn’t flag things like API keys. If it does, then some changes should be made to your flagging system.

Note for the other developers: this is a meet-in-the-middle proposal. I 100% agree with just not having a goddamn system like this at all, but this is Roblox.

I can give the best point for this: if we as developers are not allowed to view the Roblox source code, why should we be forced to have our code looked at just because “IT GOT FLAGGED”?

23 Likes

I’m assuming this system only catches instances that are genuinely malicious (like intentionally inserting really, really bad content into the game for players to see, such as NSFW or bigoted content), so in those cases they take action immediately. That doesn’t warrant giving a chance to adjust it; they normally only do that as a courtesy for small infractions in games that have a significant number of players in them at the time.

1 Like

This does not answer some/most of the questions people are asking!
For example, why should we trust the people looking through our code? Some people have personal information in their code, such as Google API keys. I think ROBLOX should answer these questions before the community/developers are actually satisfied, because at this point ROBLOX is looking at people’s code with no approval. Since it says the process “was put in place,” I’m assuming they are already doing this; for example, someone got terminated for adding onto the ROBLOX filter so it wouldn’t even let you swear, and more… So I am still not satisfied!

4 Likes

But what if the virus was inserted into the model by a malicious plugin? How do you determine whether the script was purposefully inserted into the model? Also, manual review would not be the most effective way to get rid of Lua viruses, and it would waste the moderators’ time.

4 Likes

Honestly, instead of punishing people for having viruses (accidental or intentional), they should just set their algorithm to purge them. I mean, it could hurt legitimate code, but maybe set up an appeals system for when it does? They shouldn’t really have to compromise our code to protect users from viruses, but they clearly want to protect players from them. If they have to remove viruses, they shouldn’t punish developers who picked up the virus from another source!
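An automated purge pass like that can at least be prototyped locally today. Below is a minimal sketch, assuming it is run from the Studio command bar or a plugin (Script.Source is only readable at plugin security level); the pattern list is purely illustrative and is not Roblox’s actual detection logic.

```lua
-- Hypothetical Studio-side scan for common free-model virus patterns.
-- Run from the command bar or a plugin; the patterns below are examples only.
local SUSPICIOUS_PATTERNS = {
	"getfenv",          -- common in obfuscated free-model viruses
	"require%(%d+%)",   -- require() of a raw asset ID fetched at runtime
	"fire.-spread",     -- the classic "fire spread" virus
}

local flagged = {}
for _, inst in ipairs(game:GetDescendants()) do
	if inst:IsA("LuaSourceContainer") then
		local ok, source = pcall(function()
			return inst.Source
		end)
		if ok then
			local lowered = string.lower(source)
			for _, pattern in ipairs(SUSPICIOUS_PATTERNS) do
				if string.find(lowered, pattern) then
					table.insert(flagged, inst:GetFullName())
					break
				end
			end
		end
	end
end

-- Report matches instead of deleting them outright; inst:Destroy() would be
-- the "purge" step, and that is exactly where an appeals path matters.
for _, name in ipairs(flagged) do
	print("Possible free-model virus:", name)
end
```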

6 Likes

So instead of watching over our backs 24/7 to make sure we haven’t put anything bad in our code, this feature is more focused on things like those “fire spread virus” scripts inside of free models?

1 Like

Why is this necessary? If someone were to put malicious code in a game, they would be causing no harm aside from their own game breaking, which isn’t a safety risk.

What’s the worst some harmful code can do, break a game? Malicious code can’t steal accounts or cause any harm to the player (except for ruining their experience).

Code is private, and if someone managed to get hold of someone else’s personal information, it would likely have been stored somewhere other than a giant block of code. Why would someone hide personal information in a block of code when there are a thousand better places to hide it?

5 Likes

Thank you Roblox for clearing this up, it’s honestly a relief. This cleared up many questions and worries I had about the new system that was put in place. I am glad that the people reviewing the flag will first go in-game as players to investigate before actually going in manually to review the code. I’m hoping that the automated flagging is tighter than Roblox’s current filter.

One question I still have is what are the strict rules that have been put into place?

Even though my replies still got flagged by ROBLOX as “spam,” I still think we should have the right to know who is looking at our scripts, and to get an email saying our script is being viewed. And to be honest, I don’t care who is reviewing my script: I don’t want anyone, including ROBLOX staff, to view my script at all, because it may contain personal info. That is a big security concern, because how do I know they are not going to sell said script to a third-party company? And that also brings me to another point: who is going to be looking at our scripts and find out our real-life name, or see that we put in a swear word? The only people who could and should view the scripts are the devs themselves. It’s not like a data miner is going to come to our ROBLOX game just to find info. I really want to support you on this, ROBLOX, but I can’t, and I’m sorry. Not trying to be rude, but use your time to make the moderation better rather than just viewing scripts.

4 Likes

I’m worried about custom chat filters. If we’re making our own filter because of the many issues in your filter system, will that get us in trouble?

3 Likes

What type of safety concerns are you looking for with this system? If it’s to take down botted games and Robux scams and the like before they ever hit the front page, I can see this being very valuable. I won’t repeat what many others have said; I still share a few of the concerns mentioned here as well. But I am a little more confident now that I won’t be banned for little reason.

5 Likes

The developers who have been caught by this automated system have had entire games taken down and moderation action taken against them over using a single swear word. This system is clearly flawed and the “special group of people” obviously pull the trigger way too soon when it comes to these situations.

There is no reason an entire game should be taken down over code Roblox deems inappropriate. Before ANY action is taken, Roblox should notify the developer and give them the chance to fix whatever is causing the problem.

This practice honestly makes me want to move my serious projects to other platforms. I shouldn’t feel like I’m being punished for developing on Roblox.

18 Likes

Asset moderation in general on ROBLOX is insane. Developers are jumping through huge hoops just to have the opportunity to be successful. People have brought up the main issues with this already, but I think ROBLOX needs a complete overhaul of asset moderation in general: code, graphics, audio, everything. The system does not work for the developers, the most important people on the platform.

6 Likes

Although this does answer some questions and concerns, I am still curious what happens if we developers have webhooks running through Discord. I have two webhooks: one shows me who joins my game, and another shows me any errors/bugs that appear in the output in servers I’m not in, so I can fix them live. Are webhooks at risk of being flagged?
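For context, webhooks like these are usually just HttpService POSTs from a server Script. A rough sketch of that setup is below; WEBHOOK_URL is a placeholder for your own Discord webhook endpoint, and it assumes HTTP requests are enabled for the game.

```lua
-- Sketch of the join-logging and error-logging webhooks described above.
local HttpService = game:GetService("HttpService")
local Players = game:GetService("Players")
local ScriptContext = game:GetService("ScriptContext")

local WEBHOOK_URL = "https://discord.com/api/webhooks/..." -- placeholder

local function post(message)
	local payload = HttpService:JSONEncode({ content = message })
	local ok, err = pcall(function()
		HttpService:PostAsync(WEBHOOK_URL, payload, Enum.HttpContentType.ApplicationJson)
	end)
	if not ok then
		warn("Webhook post failed:", err)
	end
end

-- Webhook 1: who joins the game
Players.PlayerAdded:Connect(function(player)
	post(player.Name .. " joined the game")
end)

-- Webhook 2: errors from servers the developer isn't in
ScriptContext.Error:Connect(function(message, trace)
	post("Error: " .. message .. "\n" .. trace)
end)
```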

3 Likes

I doubt it, but I’d say you’re more at risk of being flagged by Discord.

5 Likes

Another user created a filter for his game that censored out ROBLOX scam links and swear words that bypass the filter. However, he was banned/terminated for doing so. He’s still trying to appeal. I don’t recommend setting up your own chat filter at all. I’m afraid we’re just stuck with the current faulty one.
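For clarity on what such a filter looks like: it typically sits on top of Roblox’s built-in filtering rather than replacing it. The sketch below uses the legacy Chat callback API with a purely illustrative pattern list; it is not the banned developer’s actual code.

```lua
-- Hypothetical developer-made filter layered on top of Roblox's own filter.
local Chat = game:GetService("Chat")

local BLOCKED_PATTERNS = {
	"free robux",   -- common scam bait
	"%.%a%a%a?/",   -- crude check for links like example.xyz/...
}

Chat:RegisterChatCallback(Enum.ChatCallbackType.OnServerReceivingMessage, function(message)
	local text = string.lower(message.Message)
	for _, pattern in ipairs(BLOCKED_PATTERNS) do
		if string.find(text, pattern) then
			message.ShouldDeliver = false -- drop the message instead of showing it
			break
		end
	end
	return message
end)
```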

5 Likes

Well, the main focus of this thread is that a developer using Team Create got his account terminated because someone else with access to the game inserted a bunch of inappropriate assets and then reported the game.

But that is also a good issue to talk about too.

1 Like

This process was put in place to identify and prevent malicious activity on the platform and is intended to stop such activity without disrupting legitimate developers.

I’m hoping this also extends to free models. If it does not, then this isn’t worth the resources at all.

We’re looking to flag content that’s dangerous or harmful to our community, not find swear words in scripts that would never be seen by players.

I’m glad that got cleared up.

One of the biggest concerns we observed was around code privacy and protecting personal keys.

Well, of course; everyone was under the impression that other people would view these private scripts. A lot of that discussion could have been halted if you had replied in the original thread clarifying that. Unless you actually did, and only just now changed it to this.

Our automated review system looks for malicious behavior in code. In the rare event that code gets flagged by the system as a potential safety concern, a very small, specially-trained team goes in-game as players to check it out.

We still don’t know who the “specially-trained” team is, or how they are qualified to provide unbiased and accurate judgement.

In certain cases, parts of the game’s code may be manually reviewed by that team, who will check to see what the specifically concerning code does. We have strict rules in place determining when developer code can be seen and this is only done in the context of platform safety concerns.

I thought this was supposed to be for things that can only be seen in-game? What fits the criteria for a team member to directly access the game’s source code for manual review?

We generally do not consider Team Create sessions to be shared content, these sessions are private; however, if someone with access reports offensive content, we will investigate the author.

Alright, here is my set of questions.

  1. Who do you mean by “author”: the person who created/inserted the reported assets, or the game owner?

  2. Does that mean you access the game directly through Studio, or do you investigate through in-game means?

  3. When moderation has taken place, how do you confirm that you have the right person?

  4. To ensure that the game owner isn’t held solely responsible, and that only the person who inserted the malicious/inappropriate assets is, do you tag created/inserted items with their creator?

2 Likes

If you live in America, this is the Roblox version of the Patriot Act. It is the best course of action in light of whatever darkness you can still manage in Roblox’s sandboxed Lua.

2 Likes

I will be fine with people moderating my code as long as one thing happens…

We know exactly who is moderating our code. Full names, not usernames. When you say a “specially-trained team,” that isn’t enough information. I do not want someone random looking at my code. As long as I know who they are, I am fine with it.

If this isn’t provided, I will not develop on ROBLOX anymore.

4 Likes