Thank you for this question! We are piloting an avatar creator feedback group now and hope to roll out a broader channel in the near future. More generally, we’ve been trying to spend more time with UGC creators at events such as Gamescom LATAM and RDC, and I’ve definitely been learning a lot about the challenges facing the community.
We are not currently working on dynamic clouds, but could you let us know if you have specific feature requests? We’ll make sure they’re on our radar for consideration and future roadmaps.
We welcome advertising on our platform from all brands and creators that adhere to our Advertising Standards, Community Standards, and Terms of Use.
The Engine Open Cloud API for Executing Luau will allow you to run a Luau script against a place in the Roblox game engine via Open Cloud. You submit a script; we spin up a server, load the place, and run your code. You can then request the logs and return values.
We’ll have more details on this very soon as we are close to announcing our Beta.
Note: the term ‘universe scripts’ has been used in a few different ways over the years at Roblox. This isn’t something we’re actively working on right now, but we’d love to hear more about what you are looking for in a feature like this.
Thanks for your question! Here’s how the Open Cloud Engine API for Luau execution works:
- You make a request to create a task, including the universeId, placeId, placeVersion, and the string you want to run as Luau.
- We launch a server, load the place, and execute the code with DataModel and Engine Luau API access.
- You poll for the operation status, and upon completion, receive the script’s return values and logs.
We’ll provide sample Python scripts to make this all nice and easy, and more details soon!
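The create-then-poll flow above can be sketched in Python. Note this is an illustrative sketch, not the official sample: the endpoint paths, the `x-api-key` header, the `script` payload field, and the `state`/`path` response fields are assumptions inferred from the description above and may differ from the final API.

```python
import json
import time
import urllib.request

BASE = "https://apis.roblox.com/cloud/v2"  # assumed Open Cloud base URL


def task_url(universe_id: int, place_id: int, place_version: int) -> str:
    """Build the (assumed) URL for creating a Luau execution task."""
    return (
        f"{BASE}/universes/{universe_id}/places/{place_id}"
        f"/versions/{place_version}/luau-execution-session-tasks"
    )


def run_luau(api_key: str, universe_id: int, place_id: int,
             place_version: int, script: str, poll_seconds: float = 2.0) -> dict:
    """Create an execution task, then poll until it finishes.

    Returns the final task object, which (per the answer above) would
    carry the script's return values and logs on completion.
    """
    headers = {"x-api-key": api_key, "Content-Type": "application/json"}

    # Step 1: create the task with the Luau source string.
    req = urllib.request.Request(
        task_url(universe_id, place_id, place_version),
        data=json.dumps({"script": script}).encode(),
        headers=headers,
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        task = json.load(resp)

    # Step 2: poll the operation until it reaches a terminal state.
    while task.get("state") not in ("COMPLETE", "FAILED"):
        time.sleep(poll_seconds)
        poll = urllib.request.Request(
            f"{BASE}/{task['path']}", headers={"x-api-key": api_key}
        )
        with urllib.request.urlopen(poll) as resp:
            task = json.load(resp)
    return task
```

Usage would look like `run_luau(key, 123, 456, 789, "return workspace.Name")`, with the return value and logs read off the completed task object.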
As part of sunsetting Compatibility Lighting, we recently added a ColorGradingEffect instance, which can be used for tone mapping. Thank you for the feedback around the other effects – we’ll add them to our feature request list, but we’re not currently working on them.
Those use cases are great, and things we aspire to support over time. However, regarding direct access to specific hardware, one of our core principles is that the experiences you build should be able to run wherever Roblox runs, from PlayStation to low-end mobile to high-end PC. Using specific hardware instructions could prevent an experience from running if the hardware isn’t available, so we would approach this from a higher level of mapping workloads to available hardware.
Thank you for the question. We are enhancing Avatar Auto Setup to run on partially complete models, and hope to release this by Q1 2025. We are really looking forward to seeing how the community uses it!
Additionally, we’re working on a long-term plan for modesty layers and 2D clothing, but we can’t share a specific timeline at this stage.
Thank you for your question. We are working on improving alt detection using multiple techniques to make it as reliable as possible while still carefully protecting our users’ privacy. Unfortunately, we can’t comment on specific approaches at this time.
We are very excited to make these improvements. We know they’ve been a long time coming and want to make sure we get them right. In the near future, we plan to roll out visual improvements like alignment (i.e., the zoom and border padding) and lighting improvements that would be standardized across each category.
Looking further ahead, we are exploring giving creators more control over these thumbnails (such as selecting camera angles, background colors, and poses).
Thanks for this feedback. Presently, we have no plans to enable uploading of audio assets from within an experience. Creators can continue to upload private audio that complies with our policies, or use the growing library of public audio on the Creator Store from distributors like DistroKid and Monstercat.
Great question. For the pilot of our affiliate program, creators will earn up to 50% on qualified Robux purchases from new users for the first six months. Over time, we’ll tune these parameters, refining the long-term incentives and payout window based on data and creator feedback, to ensure the program is both rewarding and sustainable.
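To make the pilot terms concrete, here is a small illustrative calculation. The 50% rate and six-month window come from the answer above; the function name, the assumption of a flat rate, and the qualification logic are hypothetical, since the exact rules aren’t specified here.

```python
def affiliate_payout(qualified_robux: int, months_since_signup: int,
                     rate: float = 0.50, window_months: int = 6) -> float:
    """Estimate earnings on a new user's qualified Robux purchases.

    Mirrors the 'up to 50% for the first six months' figures above;
    purchases after the window earn nothing under this sketch.
    """
    if months_since_signup >= window_months:
        return 0.0
    return qualified_robux * rate


# 1,000 qualified Robux from a user two months after signup, at the full rate:
print(affiliate_payout(1000, months_since_signup=2))  # 500.0
# The same purchase seven months in falls outside the window:
print(affiliate_payout(1000, months_since_signup=7))  # 0.0
```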
Thank you for the request, and we are definitely aware of the demand for UI blur! Right now, our UI team is focused on improving the performance of our UI system. When we switch back to feature development, we will definitely be investigating blur.
Thanks a lot for the question – we’re excited about the accessory refinement tools too! As some of you may have noticed, we’re currently testing our accessory refinement tools in the Avatar Editor with a select number of users.
The plan is to roll these out to all users and launch them as APIs for you to implement in your own experiences. Our current tests are the first part of the rollout and the developer APIs will launch later this year, so stay tuned!
As mentioned above, we are definitely aware of the demand for UI blur; our UI team is focused on improving the performance of our UI system before returning to feature development, at which point we will be investigating blur.
Shadows are in a similar place. We’re aware of the demand and have it on our feature requests, and we’ve noted the interest from this AMA. Our primary challenge is ensuring everything we provide is done in a performant, cross-platform way that works for all of your use cases.
In addition to the proactive moderation work, we recently launched a stop-gap solution for creators to request that their avatar items be removed. It’s true that this form does not support bundles as of now, but we plan to launch a self-serve way for creators to remove their items and bundles in the coming weeks. We’ll let the community know as we get closer to launch.
Thank you for the idea! We have nothing currently planned, but vector scaling definitely lines up with our goals around content that easily scales to different devices, especially for UIs.
We’ve had a great time hearing your thoughts and following up on what we announced at RDC this year. Your advice and feedback are invaluable as we forge ahead with our vision for the future of creation on Roblox. We can’t wait to connect with you again in the next AMA.
Until then, keep creating!
We investigated a few options to enable ‘pay what you want’ for plugins, but ultimately decided that a more traditional pricing program was best for the Creator Store. Good news, though: with the support of a parent or guardian, creators ages 13 to 17 are able to sell plugins.
We are working hard to make Studio work better with the other tools that you use. We will soon be releasing an early preview of Script Sync in Studio that makes it easier to work with external code editors like VSCode. We’re also having internal discussions about how we could extend this to other file-based applications like Blender or Photoshop, so stay tuned.
We’re also looking closely at how we can give plugins better access to our DynamicImage and DynamicMesh APIs (including publishing to an asset) so our community is not bottlenecked by our progress.