1. What do you want to achieve?
I want to know if there’s a way to automatically detect and block avatars with inappropriate content before they enter any game—handled at the platform level, rather than left to each individual game developer.
2. What is the issue?
Many users customize their avatars with inappropriate clothing, decals, or layered-clothing designs that bypass moderation. These avatars end up in games where they clearly don’t belong, especially kid-friendly or social experiences. It’s frustrating because moderation doesn’t always catch them, and as developers it’s impossible for us to screen every user manually.
3. What solutions have you tried so far?
I’ve tried using HumanoidDescription filters, and I’ve looked into server-side scripts that detect specific accessory or clothing asset IDs. I’ve also searched the Creator Hub and other DevForum posts, but most suggestions only apply to a single game and don’t solve the platform-wide issue. It feels like a game of whack-a-mole.
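For anyone curious what the per-game, server-side approach looks like, here is a minimal sketch of an asset-ID blocklist built on HumanoidDescription. The IDs in BLOCKED_ASSET_IDS are placeholders (you would have to maintain that list yourself), and of course this only protects your own experience, which is exactly the whack-a-mole limitation I’m describing:

```lua
-- Server Script (e.g. in ServerScriptService).
-- Strips blocklisted clothing and accessories when a character's appearance loads.
-- NOTE: these IDs are made-up placeholders, not a real blocklist.

local Players = game:GetService("Players")

local BLOCKED_ASSET_IDS = {
	[123456789] = true, -- placeholder shirt ID
	[987654321] = true, -- placeholder accessory ID
}

local function sanitizeCharacter(character)
	local humanoid = character:FindFirstChildOfClass("Humanoid")
	if not humanoid then
		return
	end

	local description = humanoid:GetAppliedDescription()
	local changed = false

	-- Clear blocklisted classic clothing (0 means "no asset").
	if BLOCKED_ASSET_IDS[description.Shirt] then
		description.Shirt = 0
		changed = true
	end
	if BLOCKED_ASSET_IDS[description.Pants] then
		description.Pants = 0
		changed = true
	end

	-- Filter accessories; passing true also includes rigid accessories,
	-- and layered clothing is always returned.
	local kept = {}
	for _, accessory in ipairs(description:GetAccessories(true)) do
		if BLOCKED_ASSET_IDS[accessory.AssetId] then
			changed = true
		else
			table.insert(kept, accessory)
		end
	end

	if changed then
		description:SetAccessories(kept, true)
		humanoid:ApplyDescription(description)
	end
end

Players.PlayerAdded:Connect(function(player)
	player.CharacterAppearanceLoaded:Connect(sanitizeCharacter)
end)
```

This reacts after the appearance loads, so a blocked item may be visible for a moment before it’s stripped, and the blocklist has to be updated every time a new bypass item appears.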
Additional Thoughts:
Let’s be real—there will always be people who try to work around the rules. It’s like trying to make a game without bugs; you can fix most of them, but one will always slip through. Perfection is the goal, but it’s rarely 100% possible.
Still, bad content needs to be handled better, and we need tools or platform-level systems to help detect this stuff before it gets into our games. It’s tiring having to report players individually when the system should already be protecting the experience.
“You can’t know what good is unless you’ve seen the bad. But just because bad exists doesn’t mean it should be ignored.”
Has anyone found a better solution or workaround for filtering avatar content? Maybe something I missed?