Nick and Tian here! We are excited to update you on what we announced today at the Roblox Developer Conference (RDC). We also want to celebrate the incredible things you created over the past year, from experiences to avatars to assets and tools to events you put on for the Roblox community. It’s been an incredible year, and you’ve moved the platform to another level.
Before we get into all of the announcements, we want to talk a little about what’s happening behind the scenes to help improve your day-to-day experience creating on Roblox. Our main focus is to make the platform you build on daily rock solid from a security, stability, performance, and reliability standpoint.
That starts with our bug process: almost every bug you file now has an owner. We’ve also started tracking what we call uninterrupted Studio session time. The time you can work in Studio without a crash has more than doubled, from around 30 hours last year to 80 hours this year [1], and we got there by reducing the crash rate by 50% [2]. We’re also working on hangs, have launched Team Create auto-reconnect, improved scripting latency, and added support for Apple Silicon, and that’s not all.
We also announced Secret Store for HttpService, a new service to securely store secrets such as API keys and credentials separately from your code. You can set your secrets in the Creator Dashboard or through Open Cloud APIs and refer to them in code by a reference key.
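As a sketch of how a stored secret might be used from a server script, based on the description above: the reference key `api_key` and the target URL are illustrative, and the secret comes back as an opaque `Secret` object that can only be embedded into requests, never read as a string.

```lua
local HttpService = game:GetService("HttpService")

-- "api_key" is a hypothetical reference key configured in the Creator Dashboard
local apiKey = HttpService:GetSecret("api_key")

-- A Secret can't be printed or concatenated like a string; helpers such as
-- AddPrefix let you shape it into a header value without exposing it.
local response = HttpService:RequestAsync({
	Url = "https://example.com/v1/data", -- illustrative endpoint
	Method = "GET",
	Headers = {
		["Authorization"] = apiKey:AddPrefix("Bearer "),
	},
})
print(response.StatusCode)
```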
Coming later this year: Secret Store for HttpService will securely store your secrets like API keys and credentials separately from your code
We want to ensure that we’re building solutions that fit your workflows, which means we’ve been putting a lot of effort into improving Studio and making our platform work with the tools you love to use. Recently, we enabled OAuth support, released a Blender add-on that synchronizes changes to your models with Studio, and released full support for glTF yesterday.
Since we know that more of you are regularly collaborating with a team, we’re working on real-time collaboration solutions and will be rolling out live scripting in Team Create. We are also working to roll out the ability to transfer experiences from accounts to groups and, later, other asset types.
We are excited to announce that soon you will be able to edit and update your Roblox files in your favorite code editor or integrate with your source control system. This has been one of your most requested features, and we’re excited that it is coming soon.
We are extending our Open Cloud APIs to include APIs to access your Data Model. This will open up new use cases. For instance, imagine being able to update what items you are selling in your experience shop by changing a few cells in a Google Sheet.
Coming later this year: You will be able to sync files between Studio and code editors and source control systems
Every day, more than 65 million users join Roblox using a wide range of devices, including mobile (iOS and Android), desktops, and Xbox. Soon, Roblox will be accessible to even more people, including those on PlayStation and Meta Quest.
This opens up another opportunity for developers to create and share their experiences with millions of people instantly on PlayStation and Meta Quest. In the coming months, Roblox developers will be able to bring their existing experiences to PlayStation and Meta Quest or create unique new experiences with console and VR devices in mind.
Roblox has always strived to make creation accessible to everyone. But whether someone is a brand-new creator or an experienced one, there are still barriers to creating. Beginning creators often struggle with learning to code, and experienced creators must constantly innovate and create new content for their experiences.
Earlier this year, we launched betas for generative AI-powered solutions: Code Assist and Material Generator. Since releasing Code Assist, we’ve seen double the amount of code generated compared to our previous autocomplete solution. Creators using the beta for Material Generator, which enables them to generate material variants with text prompts, increased their use of physics-based rendering (PBR) material variations by more than 50 percent compared to creators not using the beta. These early solutions helped show the promise of generative AI and its ability to help creators be more productive and creative.
We’re excited to announce Assistant, a conversational AI that empowers you to find answers, explain, debug, and iterate on code, and build immersive experiences faster. You can interact with Assistant across Creator Hub and Studio. You’ll soon be able to interact with Assistant within our docs, helping you learn faster. Working with Assistant will be collaborative and iterative, so you can provide feedback and Assistant will work to provide the best solution.
Mock of Assistant re-texturing a bear based on creator prompts
Next year, you can use the same APIs that power Assistant to power your in-experience creations. Also, to build world-class AI models and empower all AIs to speak Luau, we’re rolling out a voluntary opt-in program that lets you contribute some of your scripts from select experiences to our AI models, to a public Luau dataset, or to both. People who opt in to sharing their data with Roblox will get access to enhanced AI models behind the scenes. Sharing with the public Luau dataset helps elevate Luau as a language and ensures that future AI models and LLMs can speak Luau.
In August, we launched publishing packages directly from experiences and the ability to save them to your inventory. Today, we showcased how publishing packages to inventories and creating avatars in-experience is just the beginning. Ultimately, the goal is to empower creation everywhere so that you can take what you created with you from experience to experience. Imagine one experience where you design a fabric. Then, you visit another experience where you design a coat with your fabric. Finally, imagine building a whole shop for your designs as a standalone experience. We see this unlocking new creation possibilities and economic opportunities for you.
One of the unique things about Roblox is that it is a world simulation engine that allows anyone to create rich, immersive, realistic experiences.
Coming off the heels of our announcements of global wind and aerodynamic forces, today we previewed innovations arriving next year: hydrodynamics for more realistic interactions with water, prescribed wind capabilities, and the future of our lighting, which adapts to ensure the highest-quality lighting on every device. We also announced that one of the most requested features of all time, the ability to control the height of grass in experiences, is coming very soon as a non-scriptable property and later as an API.
Image of grass height being adjusted in Studio
To build truly immersive experiences, audio and video play a significant role. Today, we previewed a new enhanced audio API that lets you create multiple sound sources from existing audio assets and live voice sources. It opens new creative opportunities with in-world sound listeners, such as microphones that pick up sound and can be wired to rebroadcast it throughout the experience. Later this month, we are launching short-form video uploads for developers who are 13+ and ID-verified. You can use video for tutorials, cutscenes, or background decor to take immersion to a deeper level.
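As a rough sketch of how the previewed audio objects could fit together, the snippet below wires an audio asset to an in-world emitter. Instance and property names like `AudioPlayer`, `AudioEmitter`, and `Wire` reflect the preview and may change before release; the asset ID and part name are placeholders.

```lua
-- Assumes a Part named "Speaker" exists in the workspace (illustrative).
local audioPlayer = Instance.new("AudioPlayer") -- plays an existing audio asset
audioPlayer.Asset = "rbxassetid://0" -- placeholder asset ID
audioPlayer.Parent = workspace

local emitter = Instance.new("AudioEmitter") -- emits sound into the 3D world
emitter.Parent = workspace.Speaker

-- A Wire routes the player's output into the emitter, so the sound plays
-- from the Speaker part's position in the world.
local wire = Instance.new("Wire")
wire.SourceInstance = audioPlayer
wire.TargetInstance = emitter
wire.Parent = emitter

audioPlayer:Play()
```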
Today, we showcased new mesh and texture APIs to help you morph the shape and dimensions of your avatar or change the textures of items within experiences. Within the next few months, you’ll be able to adopt these APIs to let people in your experiences customize avatars. This will start with the customization of avatars themselves—adding freckles or changing the shape of your jaw—then expand into the in-experience creation of avatar items, clothing, and emotes.
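A minimal sketch of what vertex-level editing could look like with the showcased mesh APIs. The method names here (`CreateEditableMeshAsync`, `GetVertices`, `GetPosition`, `SetPosition`) are assumptions based on the preview and may differ at release.

```lua
local AssetService = game:GetService("AssetService")

-- Hypothetical: load a mesh asset into an editable form (placeholder ID)
local editableMesh = AssetService:CreateEditableMeshAsync("rbxassetid://0")

-- Nudge every vertex outward slightly, e.g. to widen a jaw shape
for _, vertexId in editableMesh:GetVertices() do
	local position = editableMesh:GetPosition(vertexId)
	editableMesh:SetPosition(vertexId, position * 1.05)
end
```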
Coming later this year, we’ll launch a new default avatar movement system. Over time you’ll get new actions such as duck and crouch, improved swimming, and object interactions. And the system will automatically improve with better physics integration, ML-powered movement, and continually improving performance. It will enable people to explore more naturally and in tune with the experiences you’re creating.
We are making it much easier to import and preview avatars you make with third-party tools. We now support the glTF standard in addition to OBJ and FBX.
In the near future, you can look for two things. First, we’ll enable tools that take an imported avatar mesh and automatically rig, skin, cage, and segment it into a fully animated avatar. Second, we’ve heard your frustrations around accessories that don’t easily fit all avatars; we are working on it, and an avatar accessory refinement tool will help fit accessories perfectly to any avatar.
Finally, we are now exploring generative AI technology that makes avatar creation easy. Dave showed you today how anyone will be able to create an avatar from a photograph and a text prompt, customize it, and add it to their inventory to start using it. The future of avatar creation is as simple as snapping a selfie and uploading it to Roblox.
Launching later this year, Roblox Connect will be a new way for friends to call each other as their avatars in a shared experience. With Connect, people can call a friend using their real name and be transported to a shared immersive space for their conversation. The device’s camera is all you need to capture and translate your motion in real time, so your facial expressions and body language can convey nuance.
On Roblox, people spend, on average, 2.3 hours a day [3], and adding real-time communication to experiences will help drive our vision of making Roblox a daily utility. We also see new opportunities for creators who make avatars and accessories, as avatars will be featured even more prominently.
We will offer a suite of APIs and reference code examples to support developers in building experiences for communication and connection. Some of the use cases these APIs will enable long-term are:
- Giving/removing permission for a given user to make calls from your experience.
- Showing/hiding the user’s self-view so they see what they look like to their friends.
- Building a phone booth location where users can call their friends to join them in the experience.
- Configuring what happens when a user answers a call (e.g., teleporting to a location).
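To make the phone booth use case concrete, here is a purely illustrative sketch; `PromptPhoneBook` and the prompt wiring are hypothetical placeholders for the call APIs described above, not a shipped interface.

```lua
local SocialService = game:GetService("SocialService")

-- Assumes a PhoneBooth model with a ProximityPrompt exists (illustrative)
local prompt = workspace.PhoneBooth.ProximityPrompt

prompt.Triggered:Connect(function(player)
	-- Hypothetical API: open the calling UI so the player can ring a friend
	SocialService:PromptPhoneBook(player, "phone_booth")
end)
```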
In 2024, we plan to support real-time animation of your avatar’s upper body using the same webcam video stream we use for facial animation.
Later this year, you will be able to offer subscriptions within your experiences. Subscriptions could grant access to exclusive features, enable special abilities or powers, or gate an entire experience to subscribers only.
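Assuming the subscription APIs follow existing MarketplaceService conventions, a server script might gate features as sketched below. The method names and the subscription identifier are assumptions until the feature ships.

```lua
local MarketplaceService = game:GetService("MarketplaceService")
local Players = game:GetService("Players")

local SUBSCRIPTION_ID = "EXP-0" -- hypothetical subscription identifier

Players.PlayerAdded:Connect(function(player)
	-- Check the player's subscription status; pcall guards against transient errors
	local ok, status = pcall(function()
		return MarketplaceService:GetUserSubscriptionStatusAsync(player, SUBSCRIPTION_ID)
	end)
	if ok and status.IsSubscribed then
		-- unlock subscriber-only features here
	else
		MarketplaceService:PromptSubscriptionPurchase(player, SUBSCRIPTION_ID)
	end
end)
```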
We are expanding participation in our economy so that anyone who is ID-verified and has Premium can create 3D items for our Marketplace. We are also developing easier ways for you to detect duplicates of your items and more easily report, manage, and track IP claims.
We are also evolving our Marketplace so that by next year, there will be two main types of items: ones that are fixed in quantity and ones that have a flexible quantity, both with costs per unit to publish.
As we announced at last year’s RDC, we launched Immersive Ads this year. And while it was a year of learning, we are seeing early signs of success. Developers ranked #100-1000 by time spent who show portals are earning over 10% of their total earnings from Immersive Ads.
For those concerned about whether ads might affect in-experience earnings or homepage rankings, early results suggest that creators who adopted Immersive Ads are seeing a neutral impact on both. This means that creators who combine Immersive Ads with in-experience monetization can earn more.
We’re focused on increasing payouts by driving higher advertiser budgets, giving publishers more control (price floors and exclusions), improving reporting, and improving the user’s teleport experience.
And we have heard loud and clear the feedback from our community about Creator Marketplace fees. Next year, we plan to update those fees so that you keep 100 percent of the proceeds from assets sold in Studio or Creator Hub, minus sales tax and payment processing fees; these sales will no longer be subject to DevEx fees. We will also allow you to sell models in addition to plugins, and we’ll move to buying and selling assets in US$ instead of Robux. We’re starting small, but our hope is to build a vibrant economy of creator-to-creator exchanges. We want to help you diversify the ways you can earn, and we plan to pass on cost efficiencies through savings or higher payouts where we can.
Today, we shared new tools to maintain a safe environment as we roll out new features. We’re also continuing to improve our moderation accuracy to keep the environment safe for everyone while minimizing moderation’s impact on you.
We’re building tools to empower you to manage your experiences and make decisions based on context, such as what’s appropriate for your audience. Today, you can use the IsVerified API to better manage access to features within your experiences based on a person’s account verification level. This can grant access to specific parts of an experience or help you better manage in-experience moderation. We’ll soon launch a Banning API that will provide you with advanced features to proactively enforce your experience’s guidelines by controlling access to your experience.
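A sketch of how such a Banning API might be called from a server script; the method and field names below are assumptions based on the announcement, not a final interface.

```lua
local Players = game:GetService("Players")

-- Hypothetical shape: ban a user for seven days, keeping the private reason internal
Players:BanAsync({
	UserIds = { 123456 }, -- placeholder user ID
	Duration = 7 * 24 * 60 * 60, -- seconds
	DisplayReason = "Breaking our community guidelines.",
	PrivateReason = "Flagged by server-side exploit checks.",
	ApplyToUniverse = true, -- ban across all places in the experience
	ExcludeAltAccounts = false,
})
```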
Later this year, developer-configured text filters will help you determine what type of communication is appropriate for your community and experience. This work is part of our ongoing investment in providing you with the tools you need to build safety and civility into everything you do so we can create the most civil immersive communities in the world.
We’re excited to share these updates with you and can’t wait to hear your feedback. We will update the Creator Roadmap shortly so you can keep track of when some of these products will launch.