Since “moderation is handled by bots” is a speculative claim, the burden of proof falls on the person making it; the rumour is usually repeated unsubstantiated. It doesn’t take much effort to do your own research: Roblox specifies to what degree automated tooling is involved in the moderation process, both in their knowledgebase and in their SEC filing.
Roblox Community Rules. Section II (“Additional Rules for Developers”), Rule 5: Reporting Issues. Accessed 29 May 2021.
Multiple systems are integrated into the Roblox Platform to promote civility and ensure the safety of our users. These systems are designed to enforce our policies, protect users’ personal information, and abide by local laws. We leverage text-filtering, content moderation systems, and automated systems to proactively identify behaviors that may violate our policies. A human review team is continuously operating to evaluate flagged experiences. During the nine months ended September 30, 2020, our human review team evaluated over 68 million assets. Assets refer to images, meshes, audio files, and video files that developers upload to Roblox to include in their experiences. Roblox operates a customer service portal that provides self-help information along with ways to contact Roblox via email or from within the Roblox Client. In the nine months ended September 30, 2020, Roblox responded to over 9 million customer inquiries and had a human respond to all actionable safety issues within 10 minutes of their submission on average.
Roblox SEC Filing 2020. PROSPECTUS SUMMARY, Our Products and Technology, Safety. Accessed 29 May 2021.
So no, there is no evidence that “moderation is handled by bots.” On the contrary, Roblox clearly states, both in a help page and in a legal document it is obligated to keep accurate, that automated systems are used only for filtering content, pre-screening, and sorting issues in the queue for human action.