Why are there no open vacancies for remote moderation/support/safety workers?

1. What do I want to achieve?

I want to bring attention to a serious issue Roblox is facing right now, and my goal is to find a solution.
With your help and support, we can get Roblox admins to notice the problem too.

I think Roblox should open job vacancies for remote moderators from anywhere in the world.
To make it legit, they should have a strict verification process, including:

  • Passport ID
  • Phone number
  • Age verification (using images of documents)
  • 3D face scans
  • IP verification (with VPN checks)
  • Location verification

Why so strict?
It’s simple: we don’t want kids applying for the role. Moderators need to be reliable and have their personal info properly verified.
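
To make the idea concrete, here’s a rough sketch (in Python) of the “every check must pass” gate I have in mind. All the field and function names here are made up for illustration; this isn’t any existing Roblox system:

```python
from dataclasses import dataclass

# Hypothetical sketch of an "all checks must pass" applicant gate.
# Every field name here is invented for illustration; this is not an
# existing Roblox hiring API.
@dataclass
class Applicant:
    passport_verified: bool
    phone_verified: bool
    age: int
    face_scan_matches_passport: bool
    ip_clean: bool            # no VPN/proxy detected
    location_confirmed: bool

def eligible_for_moderation(applicant: Applicant, min_age: int = 18) -> bool:
    """An applicant qualifies only if every verification step passes."""
    return all([
        applicant.passport_verified,
        applicant.phone_verified,
        applicant.age >= min_age,
        applicant.face_scan_matches_passport,
        applicant.ip_clean,
        applicant.location_confirmed,
    ])
```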


2. What is the issue?

Roblox is no longer as safe as it used to be.
There are way too many child predators, exploiters, and kids misbehaving (some of them know more swear words than me). And the numbers are growing fast, but moderation isn’t keeping up.

I joined Roblox officially in 2017, but I’ve been around since 2015-2016, back when I played as a guest.
I don’t remember Roblox having these problems back then. Sure, there were issues, but it’s gotten way worse now, and it’s time to do something about it.

Here’s what I mean:

Unfortunately, the 4chan thread with more examples got deleted, so I can’t link it, but you can see a screenshot of it here. Trust me, there’s a lot more like this.


3. What solutions have you tried so far?

I’ve looked for part-time online moderation jobs with Roblox, but no luck.
It seems like Roblox doesn’t care about hiring people outside of California.
(To be honest, moderation is not something you need higher education for; any university student could moderate Roblox part-time.)

My message to Roblox:

Make moderator jobs more accessible to people worldwide. You’ll get a bigger team to handle these growing problems, and the community will feel safer!
More importantly for you as a company, you will win your investors’ trust back!
Have you aimed to create a platform for child predators?
I expect the answer to be “no”.

Give me a response, whether you agree with me or disagree. I want to know your opinion on this topic.


Most moderation is automated. Humans have flaws and can be biased in specific situations, while for machines everything is ones and zeros; automation is also more reliable and easier to scale. Roblox has millions of daily active users, so it’s impossible for them to rely on human moderation; a lot can go wrong in the process.

Roblox has pushed many updates related to parental controls and has also added age guidelines.

Investors only care about the money and the activity of the product they are investing in.

Most Roblox executors are detected by Hyperion, and people who exploited will most likely be added to the next ban-wave queue.

Yes, it does work. Roblox automates its moderation, so if something inappropriate is found, it should take action within a few minutes.

I see where you’re coming from, but I respectfully disagree.

The automated moderation system, while useful as the first layer of defense, doesn’t work as effectively as it should.
Let’s break it down:

  1. Automation is limited.
    Not everything on Roblox is fully automated. While it’s true that automod plays a big role, it mainly focuses on things like filtering inappropriate content in the in-game text chat. And sure, it’s decent at that, but it stops there.
  2. The human moderation team isn’t what it used to be.
    The second layer of moderation – the human support and moderation team – doesn’t seem to operate as efficiently as it once did. It still contributes to banning users here and there, but honestly, the impact feels minimal. Their work just isn’t as valuable anymore.

If you watch the first video I linked, you’ll see exactly what I’m talking about.

  1. Child predator reports are ignored.
    Automod is useless when it comes to child predator reports.

  2. Exploiting reports are ignored.

I’ve personally spammed reports on an exploiter recently, and guess what? They’re still not banned. That’s a major gap in how moderation operates today.

The current system is only good for preventing rule violations in chat, but for more serious problems like exploiters, it’s practically nonexistent. This is why we need better solutions, like a dedicated and accessible remote moderation team, to fill these gaps.
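
To illustrate the gap: a first-layer chat filter is essentially pattern matching, something like the toy sketch below. The patterns and function names are invented for illustration and are not Roblox’s actual filter:

```python
import re

# Toy sketch of a first-layer chat filter. The patterns below are
# placeholders, not Roblox's actual rules.
BLOCKED_PATTERNS = [
    re.compile(r"\bbadword\b", re.IGNORECASE),     # placeholder profanity entry
    re.compile(r"\d{3}[-.\s]?\d{3}[-.\s]?\d{4}"),  # phone-number-like strings
]

def filter_chat_message(message: str) -> str:
    """Replace each blocked span with hashtags, Roblox-style tagging."""
    for pattern in BLOCKED_PATTERNS:
        message = pattern.sub(lambda m: "#" * len(m.group()), message)
    return message

print(filter_chat_message("call me at 555-123-4567"))  # -> call me at ############
```

A filter like this can hashtag a phone number in a single message, but it has no concept of predatory behavior spread across many individually harmless messages, which is exactly where the human layer is supposed to come in.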

So is human moderation. The larger the player base gets, the harder it becomes for humans to moderate, not to mention how expensive it would be to keep it active and avoid false bans.

There is a feature on Roblox called “Block”. There are also parental controls, and it’s also the parents’ responsibility to monitor their children’s activity.

Exploiting on Roblox has been dead since Hyperion was released. Yes, as of right now executors work, but all of them are detected, meaning everyone who is exploiting right now will be added to the next ban-wave queue.

Here you can see that lack of protection is one of the reasons Roblox is losing investors.

Roblox is still doing as well as it was before (probably making more money by now). People invest money to make a profit; the only thing investors care about is money.

Also, every platform has the issues you listed, so it’s not just Roblox that has them. That’s just how life works; nothing is perfect in this world, and there will always be good and bad people.


Engineering and management positions are common in California, but I think Roblox has connections to a moderation company in India. Still, yes: higher education and experience in the field are expected.


As a fellow advocate for change, I commend your efforts to make Roblox better. Moderation is an issue that Roblox needs to handle and grow in; I agree with you.

However, I don’t think the forum is the correct way to take a stand. I wonder if it’s effective at all and assume it isn’t. What we can do is report bad content when we see it.

Roblox isn’t transparent about moderation, which makes it important to criticize them, and it seems our voice won’t change that. We don’t know if Roblox has an internal team that checks the website for bad actors. Roblox might only be using human moderators when content is reported, and only then, but we don’t know.


@bura1414 mentioning that safety is an issue outside of Roblox too is important. And, to that point, Roblox has an opportunity to lead in safety where, imo, they don’t right now. Of course, more can always be done, and I’m sure the people working at Roblox care about this stuff.

I don’t think it’s PR talk; I think people genuinely care about what’s going on and are passionate about making a better Roblox too. The employees care, or at least the few I’ve been able to interact with.


I’m optimistic. Things will get better with time, respectfully, with or without this post. Again, though, I commend and support the advocating, despite the likelihood that it won’t have much influence.


Moderation is an interesting topic. I both agree and disagree, depending on the situation.

The only good thing they did in the past months was adding DSA reports, where human mods at least try to review the reported content (but they don’t review in-game content for whatever reason) and take it down. There are also “User Safety Concern” reports you can file, which make a human team review your report about bad bypasses etc., but it turns out that most of the time they don’t do anything unless it’s really obvious (for example, a bad username not getting reset).

I get that Roblox added safety features, but it just seems like they are doing it to avoid moderating users under 13 instead of fixing the actual problem. Hangout games are still the same issue for people over 13, and the moderation problems still exist for users over 13.

I have to disagree with this one. Joining a literal baseplate game reveals it most of the time: exploiters use those games to test certain exploits on Roblox. Sure, they get added to the ban wave, but the punishment is nothing, since they use alts and switch alts after about a day, so ban waves every few months are useless. Alt-account detection doesn’t apply to exploit bans, only to terminations.

There are many violations on the Roblox platform, such as a Roblox group used to impersonate Roblox employees, or groups used to move bad content around. Despite reports via both support and DSA, they don’t do anything, since they only take down the most obvious stuff, and not even through in-game reports.

Another issue is the mods themselves, since some specific ones (not gonna mention names) seem to always decline the same reports, or moderate users for the same “rule violations” that are technically not against the rules in the first place.

I get that not everything can be properly reviewed and moderated, but many times the same mods don’t actually review reports and just decline them, only for the report to be re-submitted so a different mod can review it, like shown here:

Overall, this is just my opinion, and it shouldn’t be taken as hate towards anyone. (Also, my bad if I worded some stuff weirdly; I’m a bit tired at the moment.)


Your arguments are really well structured and wonderful; I really enjoy this conversation with all of you. The question is: do you think what they described works properly? Because I view it sceptically.
Parental controls are a great step; I agree with that one.

But I’m sure a child predator will just play games made for underage children after that. That is not a solution, right?


Hyperion silently detects executors, logs people in a database, and takes action on all the flagged users in the ban waves. Hyperion is meant to be anti-tampering software, not an anticheat. Also, all the good executors are now paid (yes, they are also detected).
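
To make the “flag silently, ban in waves” idea concrete, here’s a rough sketch of how such a queue could work. The table, columns, and function names are invented for illustration; this is not Hyperion’s actual pipeline:

```python
import sqlite3
import time

# Hypothetical sketch of a "flag silently, ban in waves" pipeline.
# All names here are invented; this is not Hyperion's real implementation.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE flagged (user_id INTEGER, reason TEXT, flagged_at REAL)")

def flag_user(user_id: int, reason: str) -> None:
    # Detection is logged quietly: the exploiter sees no immediate action,
    # so they keep exposing their tooling instead of adapting right away.
    db.execute("INSERT INTO flagged VALUES (?, ?, ?)", (user_id, reason, time.time()))

def run_ban_wave() -> list:
    # Periodic job: act on everyone flagged since the last wave, in one batch.
    banned = [row[0] for row in db.execute("SELECT DISTINCT user_id FROM flagged")]
    db.execute("DELETE FROM flagged")
    return banned

flag_user(101, "detected executor signature")
flag_user(102, "client memory tampering")
print(run_ban_wave())  # -> [101, 102]
```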


But, as was said here:

Exploiters still get no punishment at all, and many just make their own executors; these days they even give tutorials on how to make your own undetectable executor, since many popular ones get detected.

Exploiters don’t get a real punishment at the moment

Yeah, it is harder to avoid getting detected, but exploiters are saying it doesn’t matter if they get detected most of the time, because they are just using alt accounts.

Roblox tracks alt accounts, and ban history lasts for a year. There have been cases where people’s main accounts got terminated, so it does its job quite well. All the good executors are monthly subscriptions, and those are detected as well.

Schools or parents should be teaching kids how to protect themselves on the internet and what to do in specific situations. Besides that, I don’t think anything else can be done.

Yeah, but only because they used the same account multiple times, I’m pretty sure.

Enforcement bans are only applied to terminations and not to exploit bans, as mentioned here:

The point is that exploiters don’t care about getting detected at the moment, because most of the time they use throwaway accounts, and they even admit that inside games.

To solve this problem (or at least combat it), Roblox would need to add alt-account detection for exploiters, and I hope they will do it in the future.
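
As a rough illustration of what that could look like: accounts sharing a device fingerprint could be linked, so an exploit ban on one account propagates to its alts. The login data and field names below are invented, not Roblox’s real telemetry:

```python
from collections import defaultdict

# Hypothetical sketch of alt-account linking: accounts that share a
# device fingerprint get grouped, so an exploit ban on one account can
# propagate to its alts. The data and names are invented for illustration.
logins = [
    ("alt_account_1", "device-abc"),
    ("main_account", "device-abc"),
    ("other_player", "device-xyz"),
]

def linked_accounts(banned_account: str) -> set:
    """Return every other account seen on a device the banned account used."""
    accounts_by_device = defaultdict(set)
    for account, device in logins:
        accounts_by_device[device].add(account)
    devices_used = {d for a, d in logins if a == banned_account}
    linked = set()
    for device in devices_used:
        linked |= accounts_by_device[device]
    return linked - {banned_account}

print(linked_accounts("alt_account_1"))  # -> {'main_account'}
```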

(Also, I don’t wanna sound rude; it’s just my opinion.)

While people are near the Roblox HQ, Roblox can easily monitor them, but if they implemented remote workers, they’d need to hire more people to watch over them. That would just add to expenses and create HR issues Roblox probably doesn’t want. As a fix for this, I’d say Roblox should hire more people who can work in person; remote work for this would just create more expense and more work for Roblox, which, from a business standpoint, no one would want.


What I want to say, guys, is that we should make the moderator job more accessible in order to have more moderators and better protection. This of course would not solve the problem entirely, but it would make the platform safer.
Yes, automod kind of works, but the way it works is not the best. As you can see in the image from 4chan (provided in the first post), an innocent kid got spam-reported for not sharing his private information with possible pedophiles, and got banned.


What about volunteers?
I’m ready to do that for free, just to make the platform safer.
I just want to make it a safe place for people to play,
and I’m ready to receive a bunch of reports every day, checking them one by one, just to be sure the platform got a little safer.

I want to contribute to this,
and even though I would not be able to solve the core of the problem,
I still want to invest my time.

Maybe they can’t do that, since they would also need to supervise and verify that those reports have been properly reviewed, or something like that.

They probably couldn’t risk someone who’s corrupt taking action against innocent people, and if it’s a remote position, it’s a bit worse.


Roblox moderation is hit-and-miss at times, so having more actual people moderating would be amazing.