I understand taking the ‘social hangout’ aspect of the game into account on the questionnaire, but the enforcement of it feels a bit extreme, and it creates a lot of grey area around what is and isn’t deemed a “social hangout”.
PS: I know I’m late to the party, this is the first time I’ve heard about this
Here’s my confusion on this…
About a week ago, my game’s content label was rejected, and at first I couldn’t figure out why - the game is a literal baseplate, devoid of any content whatsoever. Since there is nothing to do in it, I couldn’t imagine which part of the questionnaire I had missed. After re-taking the questionnaire, the social hangout question was the only one I could think of that might have been the problem. After selecting yes, my experience was given a Moderate (13+) maturity label.
You can probably see where I'm going with this. The issue of determining and enforcing based on whether a game is deemed a "social hangout" is not at all based on the content within the game, but rather the lack thereof. The only reason my game is a "social hangout" game is because there is nothing to do except walk around an empty baseplate and talk in game chat.
If you go on Roblox Studio right now, create a new game using the default template, and then publish that template - your game is considered a social hangout and is expected to carry a Moderate (13+) maturity label. I don’t mean to point fingers by any means, but I do want to compare this scenario to other games to put into perspective how silly this policy looks next to other standards.
- My game - literally just a default baseplate template. Rated Moderate (13+) on Roblox.
- Squid Game - based on a gruesome series that is rated TV-MA (17+) in the USA. Rated Mild (9+) on Roblox.
- Counter Blox - designed to be a replica of Counter-Strike, which is rated M (17+). Rated Mild (9+) on Roblox.
I understand the concept. Social games do attract certain types of people, and I completely acknowledge that this is especially true of the baseplate. However, it just feels wrong moderating such a wide spectrum of games based on a very vague and unspecific policy. Saying that a tactical shooter game involving terrorists - or a game based on hundreds of people being murdered for the appeasement of wealthy elites (maybe being a tad dramatic) - is more appropriate for children than a hangout game… sounds a little goofy.
At its core, players in a social game aren’t subjected to anything that isn’t possible in essentially every other game on the platform. Take Fencing for example - not necessarily a social hangout game, but notorious for exploiters, chat bypassers, and player toxicity. The only fundamental dangers that ‘social hangout’ games carry are vulnerabilities of Roblox’s platform itself (exploiting, chat bypassing, etc.), and those exist platform-wide. Even if your game has a custom chat module with an efficient custom filter and a solid anti-cheat, it wouldn’t matter.
My biggest issue with this policy is that it implicitly acknowledges the vulnerabilities of the platform as a whole, but instead of taking a step towards tackling those vulnerabilities themselves, it feels more like a half step back that merely compensates for their existence. Given the vast number of games that might fall under this category, it also seems far-fetched to enforce it consistently.
It’s also strange that games like Pls Donate or Trade Hangout, which are completely social-based games at their core, are able to avoid the social hangout label purely because of a simple donation mechanic, or because the underlying topic of conversation happens to be Roblox trading. It begs the question: what is the massive difference that makes one suitable for All Ages while the other is forced to be labeled 13+?
Ultimately, this policy raises a lot of questions, and I don’t think it directly solves any specific issue. The fine line between having a game labeled Moderate (13+) and doing nothing more than adding a theme or small activity to suddenly make it fit for All Ages just doesn’t hold up in the grand scheme of things. Again, I’m not trying to point fingers and claim that all of these games should have their labels moderated, and I’m not dismissing the obvious issues that social games currently have. I was happy to see the chat restrictions on accounts under 13; although they still don’t address the source of the issues, they are a viable solution that works at scale. I just don’t think this “social hangout” policy is helping - if anything, it’s obstructive for creators.
This isn’t coming from a place of jealousy or spite over my game being moderated. I am the last person to sit on a high horse and pretend that people’s behaviour in my game hasn’t been problematic. The behaviour of some of the people who join my game, and many other games like it, is not suitable for children. I am totally content with my game specifically being 13+. Child safety is a very important issue, and I have always pushed for ways to prevent this behaviour at the platform level.
I’ve been trying to solve the chat bypassing issues on my own, but every time I add more to the filter, people inevitably find ways around it. It always feels like I’m jerry-rigging a script to intercept the Roblox core scripts, or settling for half-baked solutions that are too obstructive to the player’s experience. The fact that the only ways I’ve found to filter and remove messages from the chat directly are either building an entirely custom chat module, or executing this random line on everyone’s client - “message.Status = Enum.TextChatMessageStatus.InvalidTextChannelPermissions” - says a lot about how inaccessible and difficult it is to mitigate Roblox’s core vulnerabilities on our own.
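For anyone in the same boat, here is roughly what that interception looks like with TextChatService’s documented client-side callback - a minimal sketch, not a production filter, where isAllowed is a hypothetical stand-in for whatever custom filter logic you maintain yourself:

```lua
-- LocalScript in StarterPlayerScripts (runs on each client inside Roblox).
local TextChatService = game:GetService("TextChatService")

-- Hypothetical predicate: swap in your own custom filter here.
local function isAllowed(text: string): boolean
	return string.find(string.lower(text), "badword") == nil
end

-- OnIncomingMessage fires before a message is rendered; returning a
-- TextChatMessageProperties overrides how it is displayed locally.
TextChatService.OnIncomingMessage = function(message: TextChatMessage)
	if not isAllowed(message.Text) then
		local overrides = Instance.new("TextChatMessageProperties")
		overrides.Text = "" -- blank the message out instead of touching Status
		return overrides
	end
end
```

This only changes what each client displays - it doesn’t stop the message from reaching other clients, which is exactly the kind of platform-level gap the post is describing.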
All in all, you can’t expect every developer to know how to engineer their own efficient chat filter or anti-cheat, and until the vulnerabilities are addressed on the platform as a whole - or solutions are made more widely accessible - the safety of children in every game will remain at risk.
This concludes my Ted Talk.