This will be one of the most destructive updates in Roblox history, if not the most destructive, if it comes to fruition.
Not only are we going to see one of the most massive player-count drops, which means less money and maintenance for both the company and the developers, but it’ll also decrease safety, as ID leaks can result in identity fraud, identity theft, and an increased risk of child endangerment.
This will kill the platform, and the company with it, since the company relies on the platform to generate revenue.
The only viable options for ensuring safety would be hiring more human moderators, improving AI moderation and parental controls, and giving developers more moderation tools.
Roblox should’ve never gone public in the first place.
I have a number of concerns which I haven’t seen raised yet.
What is Roblox’s plan to ensure this doesn’t dismantle communities? Players in our community feel safe - they can reach out directly to one of our moderators or even a developer to report concerns. They are also able to seek advice and share concerns in our moderated chats. From my experience, there are strong safety benefits
to having a healthy community.
Are we still going to be allowed to share our Discord server invite with 13+ users? This would technically bypass the age group restrictions, but still seems to be allowed!
If no one in your age group is playing the game, will you just be put into an empty server? Lots of games need a large group of players online to work and be fun, so if there is nothing to prevent this, the barrier to creating quality social games is raised, and many games may die within certain age groups or bring in less revenue.
Have Roblox undertaken any risk assessment on the false sense of security that separating age groups may provide? AI age estimation is not an unbeatable system - those who are dedicated (or lucky) enough will be able to get around it. However, the message Roblox is sending out to parents and young teens is that “You are safe, only people around your age can chat to you”. I worry that this may make children and parents lower their guard and make them more susceptible to grooming, as they would be less likely to question claims that the person they are talking to is the age they say they are. Perhaps parents who would not allow their child to communicate at all may let their children communicate, or a child who previously wouldn’t have communicated off-platform now feels like it is safe enough to add a person they’ve met on Roblox on a third party like Discord. This is my biggest concern, and one that I don’t think has been mentioned elsewhere, so if you look at nothing else please respond to this one!
Has Roblox done any risk assessment on the impact on the use of off-platform chat? Have they considered that overly harsh restrictions may drive players onto considerably less safe platforms?
From my experience, Persona and other age-estimation systems tend to underestimate your age. This is preferable to overestimation for most use cases, e.g. restricting features or the sale of alcohol, but it could be massively detrimental to safety when the restrictions apply to older players. For example, I am nearly 21, yet the facial verification puts me in the 13+ group, which would allow me to chat with players much younger than Roblox thinks I should be able to. And as mentioned above, they’re now potentially more likely to believe a lie about my age. Has Roblox made sure that the rate of underestimation is sufficiently low, and how?
Discord had a leak of sensitive personal verification data through a third-party service that claimed to delete data straight away. How is Roblox monitoring Persona, and who will be held responsible in the case of such a leak?
They can’t seem to make up their minds about wanting to keep players on-platform. Is Discord suddenly a part of Roblox now? All it takes is a few clicks and one quick account, which often doesn’t even need verification, to have a completely unmoderated conversation there. Is it just to help daddy Roblox avoid accountability while still leaving an open hole for bad actors to operate?
You know, I like Roblox because I can play games that maybe younger people would like, or vice versa, but entering any of those games will be just EMPTY now.
Furthermore, I’m not going to be able to develop my games with Team Create now!
Our game uses custom matchmaking which works across multiple servers. Will developers be able to access APIs on text chat age groups? If so, how are Roblox addressing player privacy (a developer can deduce a Player’s rough age), and if not, what are the plans to ensure large games like ours that need custom matchmaking can compete?
yo. Yo. YO. ROBLOX. What are we even doing now…? Like I get it, “nO mORe PreDATorS”, but OMG, this “solution” just made stuff SOOOO much worse. I might not be able to work with my friend anymore, and any predator with half a brain cell can use a fake kid image to trick this stupid AI, and BOOM! Now they have exclusive access to all of the kids on the site. Maybe you could have more legitimate humans doing moderation, rather than an AI that can’t tell the difference between numbers, slurs, and normal human conversation.
Roblox is basically limiting adults, teenagers, and children from talking to each other.
This is a terrible idea; whoever decided to post this and move forward with it ought to be fired.
Adults can MORE EASILY talk to teens and children now; Roblox just filtered them.
And instead of banning the accounts/experiences that promote this, they decided to do this. That’s like restricting students from talking to teachers, or teenagers from talking to store employees; humans are meant to communicate with each other.
Now, I understand that there are creepy people on the internet, Roblox has a terrible moderation system, and I don’t mean to put all the blame on the parents, but you gotta start asking: where are the child’s parents? Why aren’t they observing, watching, limiting, whatever, their child on the internet, or even educating them on the dangers? Parents need to take some accountability.
Roblox just ruined every roleplay, hangout, and any game where communication is important/crucial to gameplay.
Why can’t moderators just log the chat from the server?
Also, your point about people going off-platform to chat doesn’t consider that you can still chat: I believe it’s easier to do the age check than to redirect users off-platform, especially when you can’t even chat with them. Many people won’t do the age check, but I believe kids will, and they’re the ones endangered by going off-platform.
Your point about adults bypassing the check to talk with kids is totally valid, though.
I think this is a very important question that Roblox as a company needs to answer going forward. Games that have competitive gameplay design intrinsic to them already have difficulty with matchmaking players effectively based on skill statistics like elo and MMR. If you add an additional contingency on top of that (filtering based on age), it becomes far more nuanced and difficult to maintain as a developer.
Imagine you run a competitive shooter game where the top 10% of your player-base (for example purposes, let’s say 300 people) are above the rank of Platinum. Then above that you have Diamond and Champion. Statistically that leaves like what, 10 players at the highest ranks in the game? Imagine filtering for age AND region on top of that and trying to queue and play your favorite game. It’s an impossible situation that’s going to lead to a downward trend of player retention.
Now couple that with the recently added changes to how premium payouts are calculated. Specific types of games that don’t cater to literal brain rot and mindless grinding are already struggling with those new additions, but with this on top of it? We’re talking about THOUSANDS upon THOUSANDS of dollars in revenue lost per year for the developer as a result of this asinine chat update. The worst part about all of this is: to the average player, it looks like the developers’ fault. How does Roblox intend to mitigate such a significant downtick in players who matchmake in competitive spaces?
I seriously think whichever team is responsible for trust and safety needs to have a serious conversation about how they view the future of this platform. At this rate, I think I can speak for a substantial portion of the contemporary developer base who view this as a runaway train and are prepared to hop off of it.
Honestly, considering how long we have until this goes into full effect, it feels like this update was a rush job cooked up in a single meeting. There’s barely been any communication on Roblox’s end. I feel like this wasn’t thought about at all before the announcement was made.
I’d be extremely surprised if the trust and safety team alone had the power to make such a consequential change to the platform. This is the sort of decision that would have to come from the CEO directly.
(Also to clarify to other readers - as Roblox has its own “custom matchmaking” - I’m referring to matchmaking using reserved servers which I have custom made in Lua, using a ‘hub’ place.)
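For readers unfamiliar with the pattern I’m describing, a minimal sketch of hub-based matchmaking with reserved servers might look like the following. This is illustrative only: `GAME_PLACE_ID` and the `startMatch` name are placeholders, not code from our game, though `ReserveServer` and `TeleportToPrivateServer` are the real TeleportService APIs.

```lua
-- Server Script in the hub place (a sketch, not production code).
local TeleportService = game:GetService("TeleportService")

local GAME_PLACE_ID = 0 -- placeholder: replace with the match place's ID

-- Send one matched group of players to a fresh private server together.
local function startMatch(players)
	-- Reserve a private server for the match place and get its access code.
	local accessCode = TeleportService:ReserveServer(GAME_PLACE_ID)
	-- Teleport the whole matched group into that reserved server.
	TeleportService:TeleportToPrivateServer(GAME_PLACE_ID, accessCode, players)
end
```

Any age-group chat restriction would have to be layered onto whatever grouping logic feeds a function like this, which is exactly why an API exposing chat age groups, or the lack of one, matters here.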
This is the most misguided attempt at keeping children safe that this platform could have gone with.
Segregating people on the platform by using an unreliable AI to guess their age from a picture will not fix the problem of predators, bypassed items, condos, dating groups, and all the other unsavory issues on the platform.
It puts kids in more danger by making moderation much more difficult for game owners, taking adults and older teens away from where they might be able to call out bad actors before anything happens, and giving predators a half-private chat full of kids.
The bad actors WILL bypass the AI; they could even use other AIs to do it. You cannot, and should never, trust these automated systems.
Instead of this privacy-breaching, unreliable, and widely disliked idea, please consider the following:
Hire actual human moderators. I have tried so hard to get obvious NSFW-related rule violations dealt with, and your AI is flat-out not capable of recognizing what a human would.
Educate kids about online safety. Maybe even be obnoxious about it by making people take quizzes occasionally. Education is power. Give children the power to recognize a dangerous situation and tell them WHY it’s dangerous. And don’t just tell them not to talk to strangers; that alone is unhelpful for recognizing predatory behavior.
Work with people who actually want to help the platform, i.e. listen to the community and developers.
Stop using AI for anything as important as this. Facial recognition can be easily fooled, and AI moderation isn’t equipped to handle the job.
If/when this update goes into effect, I will be privating my game, and anything else that could potentially earn this platform a cent from me, and begin the process of moving platforms.
I highly encourage others whose games will be drastically affected, who generally don’t agree with the decisions Roblox has been making lately, and/or who don’t want to give their face to the data-harvesting AI, to consider your options for the future as well.