As a Roblox developer, it is currently too hard to keep a group wall safe. I have to check it daily and manually remove scam and spam comments, and even then those comments stay visible for hours before I find them.
Many developers choose to disable comments entirely to avoid these problems. Personally, I find the group wall a valuable place to get feedback from players, since only a small percentage of them bother to join Discord or Guilded servers.
I wanted to create a system that uses the GPT-4 API to automatically detect scam and spam comments and remove them, but found it is not currently possible due to the lack of public API endpoints for managing group wall comments.
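To illustrate how little would be needed on the developer side, here is a minimal sketch of the classification step I had in mind. The `ask_llm` callable and the prompt wording are my own assumptions, standing in for a real GPT-4 API call; the missing piece is still a Roblox endpoint to actually delete a flagged comment.

```python
# Hypothetical sketch: classify a group wall comment with an LLM.
# `ask_llm` is an assumption -- in practice it would wrap a GPT-4 API call.
from typing import Callable

PROMPT_TEMPLATE = (
    "You are a moderation assistant for a Roblox group wall.\n"
    "Reply with exactly one word: SCAM, SPAM, or OK.\n\n"
    "Comment: {comment}"
)

def should_remove(comment: str, ask_llm: Callable[[str], str]) -> bool:
    """Return True if the model flags the comment as scam or spam."""
    verdict = ask_llm(PROMPT_TEMPLATE.format(comment=comment)).strip().upper()
    return verdict in ("SCAM", "SPAM")

# Stand-in model for demonstration; a real deployment would call the GPT-4
# API here and then hit a (currently nonexistent) group-wall delete endpoint.
def fake_llm(prompt: str) -> str:
    return "SCAM" if "free robux" in prompt.lower() else "OK"

print(should_remove("Get FREE ROBUX at scamsite.example!", fake_llm))  # True
print(should_remove("Loved the new update, great game!", fake_llm))    # False
```

Everything except the delete call already exists today, which is why the missing endpoints are the whole blocker.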
It would be helpful if Roblox added these endpoints. Even more helpful would be for Roblox to implement such a system itself. New LLM technology has strong applications in moderation thanks to its ability to understand context, unlike earlier systems that only matched keywords.