DISCLAIMER: The bug report below is not intended as a criticism of EU legislation or Roblox’s moderation systems. Rather, it is intended to expose a severe vulnerability that is being exploited by malicious individuals with the goal of undermining the procedures set in place to protect minors online.
Hello,
I am a community developer for the Roblox open-world action role-play experience Clark County, which is developed and owned by the unionWARE development studio. Recently, our team has been under attack by bad actors who are misusing the protections Roblox offers to minors under the EU Digital Services Act in order to get high-profile accounts falsely terminated from the platform. The individual(s) have already successfully taken down our holder account, as well as the community owner’s account, through the method I have detailed below.
Bad actors not currently living in the EU begin by using a Virtual Private Network (VPN) connection to pose as EU residents and gain access to the special DSA report terminal. Alternatively, these individuals contact friends who do live in the EU and have them submit the report on their behalf. Next, they find a default (or empty, depending on the account’s creation date) starter place owned by the victim to use as a decoy for tripping Roblox’s automated moderation system. Though the decoy contains nothing explicit, bad actors will often select the most egregious violation listed as an option, typically child exploitation, in an attempt to increase the weight of the report and its corresponding punishment. After this, they simply wait for the report to be processed automatically and for an account termination to be issued against the target.
A quick look at the DSA Transparency Database using these specific search parameters reveals that dozens of fully automated actions were recently taken under the EU’s DSA against Roblox accounts, resulting in the complete termination of the involved users’ accounts. Though it is plausible that the majority of these actions were taken faithfully, we believe that many other Roblox accounts, alongside those of our team, have been terminated incorrectly. Additionally, further investigation of some of these reports reveals that actions were taken against accounts that allegedly uploaded content on the 1st of January 2001 (01-01-2001), which is of course impossible given that Roblox was not available to the public at that time. This leads me to one of two conclusions: either there is a database error on the website’s end that is producing a default fall-back date, or there is foul play involved, with an invalid date being abused to provoke an erroneous response from the automated system.
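For anyone who wants to verify this pattern themselves, below is a rough sketch of how the anomalous entries could be filtered out of a CSV export of the Transparency Database. The column names (platform_name, content_date) and the launch-date cutoff are assumptions on my part and may not match the actual export schema, so treat this as an illustration of the check rather than a finished tool.

```python
# Sketch: flag Roblox statements of reasons whose alleged content date predates
# Roblox's public availability. Column names ("platform_name", "content_date")
# are assumptions and may differ from the real Transparency Database export.
import csv
from datetime import date

ROBLOX_PUBLIC_LAUNCH = date(2006, 9, 1)  # approximate public release of Roblox

def find_implausible_statements(path: str) -> list[dict]:
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("platform_name", "").lower() != "roblox":
                continue
            try:
                content_date = date.fromisoformat(row.get("content_date", "")[:10])
            except ValueError:
                continue  # skip rows with missing or unparseable dates
            # A content date before Roblox existed points to either a default
            # fall-back value or a deliberately invalid submission.
            if content_date < ROBLOX_PUBLIC_LAUNCH:
                flagged.append(row)
    return flagged

if __name__ == "__main__":
    suspicious = find_implausible_statements("dsa_statements_export.csv")
    print(f"{len(suspicious)} statements list a content date before Roblox existed")
```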
This matter was briefly touched on earlier this year in this DevForum post, but the post did not appear to receive much attention, and the issue was neither resolved nor mitigated. It is for those reasons that I am writing today.
Expected behavior
I expect that Roblox and their moderation team will exercise their discretion fairly, as they always have, and work towards investigating and resolving this matter as soon as possible to prevent further abuse and targeting of experience administrators and holder accounts. I would like to emphasize the time-sensitive nature of this matter, as every moment that a solution is not being worked on is more time for malicious individuals to spend abusing online safeguards for unclear motives.
Thank you for your response and concern regarding my report. Firstly, I have included specific details in the private section of this bug report so that Roblox Engineers can track the specific actions that were taken against the account. Additionally, I have matched the dates and times against the public DSA Transparency Database to show that they are associated.
To address your last point, we are most definitely using the appeals process; in fact, we have several appeals currently ongoing. The point of this report was not to circumvent the regular appeal procedure; rather, it was to call attention to a serious vulnerability and to ask whether Roblox can add preventative measures so that victims are not forced to go through the trouble of re-acquiring a terminated account.
Please let me know if you have additional concerns. Thank you!
I confirmed this through my own research, and I’d like to believe they can’t ignore the issue, given that obviously incorrect information keeps being published to the EU’s website.
Sad to see this. I oversee an appeals team in a semi-major group, some of whose members are from the EU, and we have run into the same thing. I was banned many times for calling them out and reporting them to Roblox, and Roblox did nothing but ban me, not the minors under 13.
I’m sorry to hear that. Hopefully through some activism Roblox will take a look at this issue. I’m all for protecting minors, in fact I applaud Roblox and the EU for undertaking this, but I also do recognize that the internet is far from a perfect world and that there must be safeguards to uphold the system’s integrity. Thank you for sharing your story.
This is the best news I’ve received in a while! Thank you so much for your dedication and the initiative you have taken. Because of you, there will likely be real tangible change made, and we are one step closer to resolving this problem.
Keep me updated, and I’ll let you know if there’s anything from my end!
What? I think there’s some confusion here. DSA reports aren’t managed by AI; a human manually checks each report, as required by the DSA. No automatic bans are issued.
Roblox is required to upload all bans (even when the user is outside of the EU, or the ban wasn’t from a DSA report) to the transparency website. That’s why some show up as automated.
The date is probably just an error in how Roblox is uploading the ban data.
That unfortunately fails to explain why an empty, default place file could be reported as containing child exploitation material, and the report would go through and result in termination. That does not necessarily scream human moderation work to me.
There used to be a termination method, patched some time ago, where exploiters would join your game, play inappropriate animations, and then make a DSA report so a moderator would check the game out. The moderator would assume the animations were part of the game, close the game, and terminate your account. I suspect this was the case here.
There is definitely some confusion here. DSA reports are being managed by AI; a human is not manually checking each report. They secretly switched it so that automation is handling the reports.
I feel like there’s some confusion here. This is just a theory, and it seems to be false.
When you make a report, you can tell them to join a server with an invite. Many times they do indeed join your server and search for the violation. Unless these accounts are “robots” (and I’m pretty sure bots aren’t even meant to join server invites), it’s safe to say they are actually human.
You can see when they join you through a notification, and this one specifically was 6 days ago, on a report. Moderators do not seem to have an admin badge.
The DSA requires all reports using the form to be reviewed manually. Every single one. No company will risk using automation as the EU is notorious for issuing very big fines if caught breaking regulations. As you saw above, humans do indeed check DSA reports.
Well, unfortunately, when you make a report via the Digital Services Act - Illegal Content Reporting form, it does not get sent to a human for review; it gets sent to their automated systems. That part is very clear considering that when you used to submit a report, the email would have a name on it, etc., but it does not do that anymore. It also used to create a ticket that you could reply to.
The DSA may require it, but that does not mean Roblox is going to comply with the requirement, and a lot of people know that Roblox is notorious for using automation. Roblox secretly switched to primarily using bots to review these reports, thinking we wouldn’t catch on, but people have noticed. Most of the time, after submitting a report it will even tell you that it is being handled by a bot/automation.
You’re being extremely stubborn, dude. Yes, they made it much less humanized by removing the email flow where they signed off with their names, but that doesn’t mean there aren’t still humans behind it.
I’m not being stubborn, dude? I’m just saying what’s actually happening. Removing the email flow and names isn’t just a small change; it’s part of how they shifted the process over to automation. Yeah, there might technically still be humans somewhere in the chain, but the system itself is primarily bots now. When you file a DSA report, it literally comes from a no-reply email with no name, and the wording even admits it’s being handled by automation. Other people have noticed the same thing too, so it’s not just me. The DSA requiring human review doesn’t automatically mean Roblox is following it, and their use of automation and non-compliance with other DSA/GDPR requirements is well known.