Comments that don’t follow the guidelines listed above are less likely to be answered. If you have questions outside of today’s topic, please refer to Creator Docs, the DevForum, or our Creator Roadmap.
We will stop collecting questions at 11:00 AM PT, and prioritize questions that receive the most likes.
When cybercriminals are building a record on someone, they’ll sometimes try to social engineer the target’s friends into sharing personal details. You won’t be able to see someone’s connections or groups if they block you, but getting around this is as easy as using an alt account. Will Roblox be introducing privacy controls to hide this info from other players in the near future?
I understand that Schlep was sent a cease and desist for vigilante actions, however a “Trusted Flagger” program does exist. Is there any chance that Schlep and his team will become trusted flaggers?
A lot of UGC creators are unsafe on the platform because their accounts get wrongly terminated under “copyright takedown” moderation for items such as fedoras. Do you have a plan to fix that?
Roblox has spent a lot of time lately touting its automated detection systems for predatory behavior, but when it comes to manual reports from users, Roblox still does not do nearly enough to keep people safe. Many reports just go into the void or are ignored or outright denied by Roblox, even when it’s plainly obvious there is bad behavior going on. There seems to be a disconnect between Roblox’s internal tracking and the reality that users actually see, which is that a lot of seriously bad actors go unpunished. So my question is: what concrete steps will Roblox take to fix the reporting system so that reports of serious rule-breaking are actually handled appropriately, and so that user trust in the reporting system is rebuilt?
Are there any plans to add alt account detection to blocks? I recently noticed that steps were taken to hide more of the profile, such as social links. However, upon testing, this can be easily bypassed using Roblox’s built-in Switch Account feature.
I know that y’all are constantly working on improving Roblox and keeping it a leading platform in safety and civility. But if there were no constraints on the time and effort required for features, and you could add or change one thing about Roblox’s core safety features instantly right now, what would it be and why?
Can you provide some insight into the process of moderating hackers? I noticed accounts created only minutes ago being used to bypass an enforcement ban in experiences like Slap Battles.
Will you guys consider upping the UGC triangle limit for certain accessories like hair to 8k triangles, considering they sometimes need to be a little denser so they don’t look bald? Also, any update on the timeframe for when transparency is coming to UGC?
It is incredibly easy to register new accounts, and this can be used to bypass experience bans added by Players:BanAsync(). What is being done to combat this? I develop for an experience where bad actors frequently return on alts after we’ve banned their main account with ExcludeAltAccounts set to false (which should mean their alts are banned too).
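For context, the ban call in question looks roughly like this. This is a minimal sketch of the Players:BanAsync configuration table; the user ID variable, duration, and reason strings are illustrative, not from the original post:

```lua
local Players = game:GetService("Players")

-- Ban a player and, in principle, their detected alt accounts.
-- ExcludeAltAccounts = false asks Roblox to extend the ban to alts,
-- yet determined bad actors still return on fresh accounts in practice.
Players:BanAsync({
	UserIds = { targetUserId },      -- illustrative: ID(s) of the offending account
	ApplyToUniverse = true,          -- apply the ban across all places in the universe
	Duration = -1,                   -- -1 = permanent
	DisplayReason = "Banned for breaking the experience rules.",
	PrivateReason = "Repeat offender; see internal moderation log.",
	ExcludeAltAccounts = false,      -- false = alt accounts should also be banned
})
```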
Any veteran Roblox player is bound to have left a trail of data over the years - and that trail is only going to grow as Roblox offers more services like Guilded and group forums. Are there plans to let users download their personal data archives anytime soon, similar to the data request features offered by Discord and Facebook?
With the revamped asset manager, are there any plans for paid ModuleScripts, akin to how plugins can be purchased? I ask because, while the option for plugins to be paid is great, a lot more could be done if ModuleScripts could be sold too. Currently, there is no way to protect ModuleScripts from being unethically redistributed, which is a problem for individuals trying to sell their tools.
There seem to be a lot of RDC attendees getting sick, with many testing positive for COVID-19. Will there be better screening in place going forward so illnesses like this have less of a chance of making their way into large events?
When will you take protecting children from predators and pedophiles seriously? Or will you just keep sending cease and desist letters to people who do your job better than you? And when will you start listening more to the community, or make a place where YOUR community can ask you questions?
There’s an opportunity here to work with Schlep, but you guys just KEEP IGNORING IT.
Your platform could be way safer for kids if you just hired more moderators and collaborated with people who CATCH predators. And you know it would be safer if your reporting system didn’t suck and actually notified us when a report is accepted…
Also, why was Schlep even banned before, or at the same time as, the rule was created in the first place? Shouldn’t he have just received a warning not to continue in the future?