Once it’s enabled, it will work automatically in Studio, both while editing and while play testing. We’ll announce it on the DevForum when we fully roll it out, and we hope you like it!
Yes! While we’re still in the early stages of working on the 3D Foundational Model, our first steps will be to generate individual 3D assets from scratch, and in the longer term, we plan to work on helping create full scenes.
Our goal is to build great support in Roblox Studio, out of the box, for working on your code as files. This means letting you use your favorite IDE and its plugins to edit the code in any Roblox project.
Before the end of the year we plan to share the first preview of our new approach as an opt-in Studio feature. As this is a very early preview, it won’t support all of your workflows, but we want to develop this transparently and share our progress as we go so we can get continuous feedback. This build will include basic support for right-clicking on a script and syncing it bidirectionally with a file on your disk when you save.
Like you said, we’re late on this. The real reason is that we weren’t happy with our first attempt and went back to the drawing board.
For those of you who are interested in why, I’ve included some more info in the dropdown below.
What was wrong with the first version of File Sync
- File watching is hard, bidirectional sync with file watching is even harder when you factor in Team Create. We tried to skip this by making the file sync manual (i.e., click import / export). This led to a pretty janky workflow with a high risk of data loss.
- We tried to provide one tool to sync scripts in the whole DataModel to files. The challenge is that the DataModel is not a file system. Tools like Rojo have invested a lot of time building practices / conventions for this. Our attempt to do this in a way that would work on any Roblox project was not convincing.
- File sync was a separate widget with controls you needed to operate, rather than something that fit naturally into your workflow
For those of you who don’t need the details, the short of it is that the feature didn’t feel right. Those in the community who volunteered to test it told us the same.
As we gather feedback from this first preview, and build out our full roadmap, we’ll release our progress regularly.
Thank you for sharing your experience with us. We take the privacy of our users very seriously and are evaluating changes to our verified badge feature based on what we’ve been hearing from the community. We will share more in the future, but would love to hear your thoughts around what functionality you’d be looking for here.
We are not currently looking into this, but thank you for the feedback. We’ll add this to our feature request list, and we’re noting all the interest during this AMA!
We want to improve our material system over time, which may involve shaders or other approaches (e.g., parametric materials), and we want to ensure that our solutions work performantly across all platforms. While we do not have anything related to shader support in active development, we may revisit this in the future. Thank you for the feedback.
In Q4, we’ll pilot our Shopify commerce integration with a small number of creators and brands. Early next year, we’ll expand to enable any eligible creator to sell physical merchandise directly from within their experiences to users in the US ages 13+.
Creators will be able to attach a digital benefit like an avatar or in-experience item from Creator Hub when you set up your commerce products. You’ll also have access to in-experience APIs and webhooks to help you manage merchandising, checkout and order confirmation.
When we launch paid access experiences in fiat later this year, users will be able to buy on desktop and play on any platform (including Xbox).
Hey, thanks for the suggestion! This is not currently on the roadmap, but we will certainly look into something like this if we end up doing more mega posts like the RDC wrap-up in the future.
Thank you for this question! We are piloting an avatar creator feedback group now and hope to roll out a broader channel in the near future. More generally, we’ve been trying to spend more time with UGC creators at events, such as at Gamescom LATAM and RDC, and I’ve definitely been learning a lot about the challenges facing the community.
We are not currently working on dynamic clouds, but could you let us know if you have specific feature requests? We’ll make sure they’re on our radar for consideration and future roadmaps.
We welcome advertising on our platform from all brands and creators that adhere to our Advertising Standards, Community Standards, and Terms of Use.
The Engine Open Cloud API for Executing Luau will allow you to run a Luau script against a place in the Roblox game engine via Open Cloud. You specify a command, we spin up a server, load the place and run your code. You can then request the logs and return values.
We’ll have more details on this very soon as we are close to announcing our Beta.
Note: the term ‘universe scripts’ has been used in a few different ways over the years at Roblox. This isn’t something we’re actively working on right now, but we’d love to hear more about what you are looking for in a feature like this.
Thanks for your question! Here’s how the Open Cloud Engine API for Luau execution works:
- You make a request to create a task - including the universeId, placeId, placeVersion and the string you want to run as Luau.
- We launch a server, load the place, and execute the code with DataModel and Engine Luau API access.
- You poll for the operation status, and upon completion, receive the script’s return values and logs.
We’ll provide sample Python scripts to make this all nice and easy, and more details soon!
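While we wait for the official samples, the create-task / poll flow described above can be sketched in Python. Everything here is an illustrative assumption based on the steps in this answer: the base URL, endpoint path, payload fields, and response shape are placeholders, not the final API, so check the official Open Cloud docs once the Beta is announced.

```python
# Sketch of the Open Cloud Luau execution flow: create a task, poll for
# completion, then read the script's return values and logs.
# NOTE: all endpoint paths and field names below are assumptions.
import time

API_BASE = "https://apis.roblox.com/cloud/v2"  # assumed base URL


def build_task_request(universe_id, place_id, place_version, script):
    """Build the URL and JSON body for creating a Luau execution task."""
    url = (f"{API_BASE}/universes/{universe_id}/places/{place_id}"
           f"/versions/{place_version}/luau-execution-session-tasks")
    body = {"script": script}  # the Luau source to run, as a string
    return url, body


def parse_task_result(operation):
    """Extract return values and logs from an operation status payload.

    Returns None while the task is still running (caller should keep
    polling); returns a dict once the operation reports completion.
    """
    if operation.get("state") != "COMPLETE":
        return None
    output = operation.get("output", {})
    return {"results": output.get("results", []),
            "logs": output.get("logs", [])}


def poll_until_complete(fetch_operation, interval=2.0, max_tries=30):
    """Poll until the task completes; `fetch_operation` performs the
    actual HTTP GET and returns the decoded JSON operation payload."""
    for _ in range(max_tries):
        result = parse_task_result(fetch_operation())
        if result is not None:
            return result
        time.sleep(interval)
    raise TimeoutError("Luau execution task did not complete in time")
```

The HTTP layer is deliberately injected as a callable (`fetch_operation`) so the polling logic stays independent of whichever HTTP client (e.g., `requests`) a real script would use.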
As part of sunsetting Compatibility Lighting, we recently added a ColorGradingEffect instance, which can be used for tone mapping. Thank you for the feedback around the other effects – we’ll add them to our feature request list, but we’re not currently working on them.
Those use cases are great, and things we aspire to support over time. However, regarding direct access to specific hardware, one of our core principles is that the experiences you build should be able to run wherever Roblox runs, from PlayStation to low-end mobile to high-end PC. Using specific hardware instructions could prevent an experience from running when that hardware isn’t available, so we would approach this at a higher level, mapping workloads to the available hardware.
Thank you for the question. We are enhancing Avatar Auto Setup to run on partially complete models, and hope to release this by Q1 2025. We are really looking forward to seeing how the community uses it!
Additionally, we’re working on a long-term plan for modesty layers and 2D clothing, but we can’t share a specific timeline at this stage.
Thank you for your question. We are working on improving alt detection using multiple techniques to make it as reliable as possible while still carefully protecting our users’ privacy. Unfortunately, we can’t comment on specific approaches at this time.
We are very excited to make these improvements. We know they’ve been a long time coming and want to make sure we get them right. In the near future, we plan to roll out visual improvements like alignment (i.e., the zoom and border padding) and lighting improvements that would be standardized across each category.
Looking further ahead, we are exploring giving creators more control over these thumbnails (such as selecting camera angles, background colors, and poses).
Thanks for this feedback. Presently, we have no plans to enable uploading of audio assets from within an experience. Creators can continue to upload private audio that complies with our policies, or use the growing library of public audio on the Creator Store from distributors like DistroKid and Monstercat.