The pressure is on as the teams wrap up their hacks and think about how to present their projects – it’s a race to the finish line and time is ticking! (ICYMI, check out the first episode where we introduce three of the teams.)
Roblox Founder and CEO David Baszucki, Chief Product Officer Manuel Bronstein, and Chief Technology Officer Daniel Sturman visit a few of the teams to check in on their progress.
As promised, let’s dive deeper into a few of the Hack Week teams’ projects, both from the docuseries and a handful of others. Please note that these Hack Week projects were created to spark innovation and we want to share these with you solely as a way to showcase our culture of innovation. We are NOT guaranteeing that we’ll build this technology into our products in the future. You can head to our Creator Roadmap for information on upcoming product features.
Take a look at how the teams describe their projects below and feel free to show your love, ask questions, and let us know — who has what it takes to win?
Asset & Tool Subscriptions
From @rockythecorgi68 and team:
At RDC 23, we announced subscriptions within experiences. Asset and tool creators who build and sell developer tools in the Creator Store expressed interest in subscriptions as well. So our project focused on letting creators earn recurring revenue from plugins by charging a subscription, which suits buyers who only need a tool for a finite amount of time. Alternatively, creators could still sell the asset or tool as a one-time purchase.
We had trouble scoping the project technically and determining what would be achievable in our timeframe, so we prioritized building an MVP and saved the more difficult tasks for later. Making every aspect of the subscription flow feel consistent was also challenging, and our computers kept crashing as we tried to integrate multiple screen recordings and animations.
Our goal was to provide more creators access to reliable income streams and support them in producing quality tools within the Creator Store.
Roblox Live
From @Nexx @christ0pherus and team:
Ever feel like you’re endlessly scrolling through Roblox, struggling to find that perfect experience? Our team built Roblox Live to spectate any experience in full 3D as if you were really there. From Home, you can preview and move around in an experience picked from your recommendations. You can also preview where your friends are in real time. Once you join an experience, you’ll be seamlessly transported to the spawn point.
To create Roblox Live, we built a new tile on the Homepage that would take you into a pre-selected experience to spectate. We modified TeleportService to allow seamless transitions between games, used Roblox Studio to test camera controls, and used React-lua to build out the UI. Our biggest technical challenge was that assets in the world took a while to load, so we lowered the graphics settings, which allowed a user to join the experience before the assets had fully loaded. Other challenges included the camera controls and UI being overridden by the developer, leading to inconsistent results depending on the experience. We realized there's no one-size-fits-all solution for spectating that works for every experience.
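For a sense of what spectating can look like under the hood, here's a minimal Luau sketch of a free-floating, scriptable camera orbiting a point of interest. It's an illustration only, not the team's code: the SpawnLocation target and the orbit radius, height, and speed are assumptions, and the real project also handled teleporting and developer overrides.

```lua
-- A minimal LocalScript sketch of a scriptable "spectator" camera, assuming
-- the experience contains a part named SpawnLocation to orbit around. The
-- orbit radius, height, and speed are illustrative values.
local RunService = game:GetService("RunService")

local camera = workspace.CurrentCamera
camera.CameraType = Enum.CameraType.Scriptable

local target = workspace:WaitForChild("SpawnLocation").Position
local RADIUS, HEIGHT, SPEED = 40, 15, 0.3

-- Slowly circle the point of interest every render frame.
RunService.RenderStepped:Connect(function()
    local angle = os.clock() * SPEED
    local offset = Vector3.new(math.cos(angle) * RADIUS, HEIGHT, math.sin(angle) * RADIUS)
    camera.CFrame = CFrame.lookAt(target + offset, target)
end)
```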
We see spectating as a new feature for devs to curate their experiences for potential players. A developer API could make spectator behavior easier to manage, empowering developers to customize camera subjects and perspectives and giving them comprehensive control over what information is visible to spectators. Long term, we could also see Roblox Live as a new way to engage with influencers, through a live chat that would allow spectators to drop gifts or hinder gameplay.
Snoblox
From @eronaught @TheRabbitDeveloper and team:
Do you wanna build a snow…ball fight on Roblox? Our team's project involved bringing physical materials to the platform by implementing the Material Point Method (MPM). Materials in MPM are represented as particles without connectivity, allowing for deformation and fracture. We see utilizing the existing particle system as an opportunity to provide developers with a more dynamic materials system. Our project unlocks a new world of possible materials for developers to express themselves with. From shifting sand to fluffy snow, the sky's the limit.
The biggest technical challenge was debugging the snow physics, and we relied heavily on ParaView to examine our simulation results. After our initial implementation, we found that every time we started a simulation, it would almost immediately become unstable and “explode” (snow particles would start moving very fast). By visualizing the results in ParaView, we could examine exactly what forces were being applied to each particle. The other technical challenge was achieving adequate performance so the simulations could run in real time. We addressed this by adding multi-threading to most steps of the simulation, which allowed us to fully utilize the hardware and simulate much larger amounts of snow.
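To make the moving pieces concrete, here is a heavily simplified Luau sketch of one MPM timestep on a 2D grid: scatter particle mass and momentum to grid nodes, update the grid, then gather velocities back to the particles. It assumes linear interpolation weights and applies only gravity; the stress/constitutive model, boundary handling, and multi-threading that an actual snow simulation needs are omitted, and none of this is the team's code.

```lua
-- Simplified 2D MPM timestep sketch. Particles are assumed to live inside the
-- unit square; boundary conditions and the snow constitutive model are omitted.
local GRID_SIZE = 64        -- background grid is GRID_SIZE x GRID_SIZE cells
local DX = 1 / GRID_SIZE    -- cell width
local DT = 1e-3             -- timestep
local GRAVITY = Vector2.new(0, -9.8)

-- particles: array of { pos = Vector2, vel = Vector2, mass = number }
local function mpmStep(particles)
    -- 1. Reset the grid (mass and momentum per node).
    local grid = {}
    for i = 1, GRID_SIZE + 1 do
        grid[i] = {}
        for j = 1, GRID_SIZE + 1 do
            grid[i][j] = { mass = 0, mom = Vector2.zero, vel = Vector2.zero }
        end
    end

    -- Linear (tent) interpolation weight between a particle and a grid node.
    local function weight(px, nodeX)
        local d = math.abs(px - nodeX) / DX
        return d < 1 and (1 - d) or 0
    end

    -- 2. Particle-to-grid: scatter mass and momentum to the surrounding nodes.
    for _, p in ipairs(particles) do
        local baseI = math.floor(p.pos.X / DX) + 1
        local baseJ = math.floor(p.pos.Y / DX) + 1
        for i = baseI, baseI + 1 do
            for j = baseJ, baseJ + 1 do
                local w = weight(p.pos.X, (i - 1) * DX) * weight(p.pos.Y, (j - 1) * DX)
                local node = grid[i][j]
                node.mass += w * p.mass
                node.mom += p.vel * (w * p.mass)
            end
        end
    end

    -- 3. Grid update: convert momentum to velocity and apply gravity.
    for i = 1, GRID_SIZE + 1 do
        for j = 1, GRID_SIZE + 1 do
            local node = grid[i][j]
            if node.mass > 0 then
                node.vel = node.mom / node.mass + GRAVITY * DT
            end
        end
    end

    -- 4. Grid-to-particle: gather velocities back and advect the particles.
    for _, p in ipairs(particles) do
        local baseI = math.floor(p.pos.X / DX) + 1
        local baseJ = math.floor(p.pos.Y / DX) + 1
        local newVel = Vector2.zero
        for i = baseI, baseI + 1 do
            for j = baseJ, baseJ + 1 do
                local w = weight(p.pos.X, (i - 1) * DX) * weight(p.pos.Y, (j - 1) * DX)
                newVel += grid[i][j].vel * w
            end
        end
        p.vel = newVel
        p.pos += p.vel * DT
    end
end
```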
Studio Web Plugins
From @FriendlyAdder and team:
Our Hack Week project aims to connect Studio and web workflows with deep integration of Creator Hub and Studio to make creation even easier and more fun. We integrated the Chromium Embedded Framework (CEF), which drives apps like Spotify and Steam, with Roblox Studio to render the web content. We reimagined Toolbox (Creator Store), adding animated video tiles, client-side interactive analytics charts, and markdown for asset descriptions. Looking ahead, we see a future beyond Creator Store, where this is a first step in further merging Creator Hub and Studio.
A big part of our hack was exploring the crossover between engine and web rendering. We implemented interactive asset preview as a ViewportFrame rendered into an HTML canvas, with working controls. Passing inputs and frame data back and forth with clean abstractions was a fun challenge. We pushed this as far as we could, rendering a 3D scene running with WebGL and WASM to an off-screen CEF window, which is then shown in an EditableImage at a drive-in theater in the experience we built. We were able to use the Chromium developer tools throughout the project, which let us iterate rapidly.
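As one concrete piece of that crossover, below is a minimal Luau sketch of an interactive asset preview built on a ViewportFrame. The function name and the 256×256 size are assumptions for illustration; the actual hack additionally streamed frames between the engine and CEF, which isn't shown here.

```lua
-- A minimal sketch of an asset preview using a ViewportFrame, assuming
-- `assetModel` is a Model that has already been loaded (for example via
-- InsertService). Camera framing values are illustrative.
local function createAssetPreview(assetModel: Model, parentGui: Instance)
    local viewport = Instance.new("ViewportFrame")
    viewport.Size = UDim2.fromOffset(256, 256)
    viewport.BackgroundTransparency = 1
    viewport.Parent = parentGui

    -- A WorldModel lets the viewport contents support raycasts and physics.
    local world = Instance.new("WorldModel")
    world.Parent = viewport
    assetModel.Parent = world

    -- Frame the asset with a dedicated camera.
    local pivot = assetModel:GetPivot().Position
    local _, size = assetModel:GetBoundingBox()
    local camera = Instance.new("Camera")
    camera.CFrame = CFrame.lookAt(pivot + Vector3.new(0, size.Y * 0.5, size.Magnitude * 1.5), pivot)
    camera.Parent = viewport
    viewport.CurrentCamera = camera

    return viewport
end
```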
We think that the community could benefit from more seamless creation workflows, and that shared UI codebases and surfaces can aid that vision. There’s also potential to enable developer tools and plugins with non-native UIs from outside the platform.
3D Marketplace
From @PerfectlyCromulent and team:
The current Marketplace is 2D, relying on static thumbnail images – nothing fancy. Now, imagine a 3D marketplace. It downloads assets just like it would if it were running an actual experience. It renders avatars to show what items would look like on your avatar while moving. You could actually preview what you're going to buy in a way that hasn't been done before. When you're selecting an item, it can rotate and bounce – we used particle effects inside the marketplace. A fluid interface like this could make your existing avatar preview feel like a shopping companion.
To build the 3D Marketplace, we created a new page and loaded the assets into 3D space. We wired existing metadata, such as title and price, into 3D space engine components and connected the existing detail and purchasing flows to those elements. For bundles, we rendered the user's current avatar wearing the items as well, so that the user could see a real preview of the content. We also added an avatar preview that would flip up from the bottom of the screen and wave to the user, showing what any item or bundle looks like on the current avatar while it's in motion.
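For illustration, here's a hedged Luau sketch of the "see it on your avatar" idea: fetching a user's current avatar configuration and generating a rig that also wears the item being previewed. The function name and the assumption that the item is a hat accessory are ours, not the team's.

```lua
-- A sketch of previewing a catalog item on a user's own avatar, assuming the
-- item is a hat accessory; the asset ID passed in would be the item's ID.
local Players = game:GetService("Players")

local function buildAvatarPreview(userId: number, hatAssetId: number): Model
    -- Fetch the user's current avatar configuration (a yielding call).
    local description = Players:GetHumanoidDescriptionFromUserId(userId)

    -- Layer the previewed item on top of what they already wear.
    -- HatAccessory is a comma-separated list of asset IDs.
    if description.HatAccessory == "" then
        description.HatAccessory = tostring(hatAssetId)
    else
        description.HatAccessory ..= "," .. tostring(hatAssetId)
    end

    -- Create a rig wearing the combined outfit; the caller can parent it into
    -- the 3D scene and animate it for the "in motion" preview.
    return Players:CreateHumanoidModelFromDescription(description, Enum.HumanoidRigType.R15)
end
```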
Taking this project a step further, we would identify production quality options for integrating local rendering with the current 2D marketplace implementation. Our goal would be to get localized and dynamic item cards without having to completely reimplement the marketplace in 3D space.
Data Whisperer
From @yxjiang and team:
We built Data Whisperer, an agent-based knowledge discovery system that helps people quickly get insights via natural language conversation. Data Whisperer operates exclusively within our secure internal environment, ensuring data privacy is maintained.
This tool supports internal knowledge discovery and could also offer benefits to the creator community. By providing valuable data insights, Data Whisperer could assist creators in identifying and leveraging growth opportunities.
The primary technical challenge stemmed from the fact that conversational AI is still an actively researched and rapidly evolving field. We managed to address some of the data inquiries using existing techniques, and our approach will need incremental improvement, adapting as new advances become available. Our immediate next step would be to develop a more mature version of Data Whisperer that can support simple use cases internally.