We’re excited to announce that our new haptic capabilities are now available as a client beta! Thank you for all your feedback and excitement during our Studio beta over the past couple of weeks. During this client beta, you’ll be able to publish experiences with our enum-based haptics, custom waveforms, and integrated Gui haptics. All developers will be automatically enrolled in the haptics beta feature for easy access to the APIs and instances – we’ll remove this beta feature when we move to full release.
As a refresher, haptic effects refer to tactile feedback – such as a buzz or a rumble – used to enhance immersion in a game. We’re delivering:
Predefined enum-based haptic effects: We are introducing five out-of-the-box haptic effects for you to add to your experiences. They work across all haptic-supported input devices, such as console and VR controllers. The five effects are: Explosion, Collision, UIClick, UIHover, and UINotification.
Customizable waveform effects: We are giving you the ability to customize your own waveform haptic effects. This API abstracts away hardware details, so custom waveforms will work across haptic-supported input devices, within hardware limitations.
Integrated Gui haptics: A no-code method of integrating haptics into your game UI, via the HoverHaptic and ClickHaptic properties on ImageButtons and TextButtons.
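To make the three features above concrete, here is a minimal Luau sketch of how they might be wired up from a LocalScript. This is a sketch under assumptions, not a definitive implementation: the exact enum member names (e.g. whether the explosion type is `GameplayExplosion` or `Explosion`) and the units of the instance API may differ from what ships – check the documentation linked above.

```lua
-- Minimal sketch of the instance-based haptics API (Luau, LocalScript).
-- Assumes the HapticEffect instance and Enum.HapticEffectType from this beta;
-- exact enum member names may differ from the effect names listed above.

-- 1) Play a predefined, enum-based effect:
local explosion = Instance.new("HapticEffect")
explosion.Type = Enum.HapticEffectType.GameplayExplosion -- assumed member name
explosion.Parent = workspace
explosion:Play()

-- 2) No-code Gui haptics: point a button's ClickHaptic property at an effect.
--    (This can also be done entirely in Studio's Properties window, with no script.)
local click = Instance.new("HapticEffect")
click.Type = Enum.HapticEffectType.UIClick
local button = script.Parent -- assumed to be a TextButton or ImageButton
click.Parent = button
button.ClickHaptic = click -- the effect then plays automatically on click
```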
You can read more about our Studio beta launch here. That post also includes instructions on how to use these APIs and a sample placefile that uses multiple haptic effects (attached here for easy reference). You can also check out our documentation here.
What’s changed since the launch of our Studio beta
We’ve fixed the following issues during our Studio beta:
HapticEffect is now working in Team Test
PressHapticEffect is now working with Virtual Cursor mode enabled
We also wanted to provide an update on the specific combinations of client platform and input device that do not yet support haptics:
All game controllers connected to macOS 15+
VR controllers connected to PC
All game controllers connected to mobile devices
FAQs
Can you give us some examples of how to design a waveform curve?
Here are some examples of how to create haptics for intended effects. Feel free to use these as a starting point to create haptics that fit your needs!
See Examples
A light buzz
Time: 0ms, Intensity: 0.3, Interpolation: Linear
Time: 30ms, Intensity: 0.3, Interpolation: Linear
A strong rumble
Time: 0ms, Intensity: 1, Interpolation: Linear
Time: 150ms, Intensity: 1, Interpolation: Linear
Time: 175ms, Intensity: 0, Interpolation: Linear
Time: 200ms, Intensity: 0, Interpolation: Linear
Time: 225ms, Intensity: 1, Interpolation: Linear
Time: 375ms, Intensity: 1, Interpolation: Linear
Time: 400ms, Intensity: 0, Interpolation: Linear
Time: 425ms, Intensity: 0, Interpolation: Linear
Time: 450ms, Intensity: 1, Interpolation: Linear
Time: 575ms, Intensity: 1, Interpolation: Linear
Time: 600ms, Intensity: 0, Interpolation: Linear
Time: 625ms, Intensity: 0, Interpolation: Linear
Time: 650ms, Intensity: 1, Interpolation: Linear
Time: 775ms, Intensity: 1, Interpolation: Linear
Time: 800ms, Intensity: 0, Interpolation: Linear
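The key tables above can be turned into code with the custom waveform API. Below is a Luau sketch of the "light buzz" curve; it assumes a `Custom` effect type and a `SetWaveformKeys` method that accepts `FloatCurveKey`s, and it assumes key times are expressed in seconds (so 30 ms becomes 0.03) – verify both assumptions against the documentation.

```lua
-- The "light buzz" example above as a custom waveform (Luau sketch).
-- Assumptions: Enum.HapticEffectType.Custom exists, SetWaveformKeys takes an
-- array of FloatCurveKey, and key times are in seconds (0.03 s = 30 ms).
local buzz = Instance.new("HapticEffect")
buzz.Type = Enum.HapticEffectType.Custom
buzz:SetWaveformKeys({
	FloatCurveKey.new(0, 0.3, Enum.KeyInterpolationMode.Linear),
	FloatCurveKey.new(0.03, 0.3, Enum.KeyInterpolationMode.Linear),
})
buzz.Parent = workspace
buzz:Play()
```

The longer "strong rumble" table translates the same way: one `FloatCurveKey` per row, with the alternating 1/0 intensities producing the pulsing rumble.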
Why are haptics not registering at low intensity?
Input devices vary in the number of motors they contain and the granularity of intensity they support. On certain clients, haptic intensities below 0.1 may not trigger any haptic effect.
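One practical workaround, sketched below, is to floor any nonzero intensity in your waveform at 0.1 before building the curve keys. The helper name and the 0.1 threshold are illustrative assumptions based on the answer above, not an official API.

```lua
-- Sketch: floor nonzero waveform intensities at 0.1 so faint segments
-- still register on devices with coarse intensity granularity.
-- INTENSITY_FLOOR is a hypothetical constant based on the guidance above.
local INTENSITY_FLOOR = 0.1

local function clampIntensity(value: number): number
	if value > 0 and value < INTENSITY_FLOOR then
		return INTENSITY_FLOOR
	end
	return value
end
```

Intensities of exactly 0 are left untouched so deliberate silent gaps in a waveform are preserved.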
Known issues
Haptic feedback does not trigger consistently in voice-chat enabled Roblox places on iPhones, even if the mute button is turned on. We are actively working on a fix.
What’s next
We’ll keep you updated here once haptics are fully released. We can’t wait to see all your creative implementations of haptics!
Ok, this I can understand. Haptics service was tricky to use: you had to check for support, and it was difficult to update the intensity every frame.
These instances improve on the areas where Haptics service fell short.
I don’t want to go too off topic, but UIS can do everything that IAS did, and these new haptics instances are incredibly easy to use compared to the old methods.
Explosion is one of the haptic types. When played, it shoots up to a high intensity and takes slightly longer to fade off, simulating a meaningful explosion in the environment.
This sounds really good!
Would Haptic effects pair well with GUI effects onscreen in the future? I would love to use Particles for any viewport effects solely on screen.
Super cool seeing more work done with haptics! Here’s hoping we get access to even more features in the future, such as adaptive triggers, force feedback, and other effects.
This was going to be my only question. I spent a good 30 minutes tweaking values, thinking it was a Roblox issue in Studio during the Studio beta. It sounds like it’s a controller thing.
Could someone recommend a controller that allows the lower intensities? I’m looking to buy a new one, Xbox one type preferred.
It’s haptics, or controller vibration – a newer system that allows more control and easier implementation in some cases. Also mobile vibration, iirc.
Adding it gives a layer of immersion. Like if there’s an explosion, you can make the device vibrate. Some like it, some don’t. You don’t need to implement it if you don’t want to. This announcement is just saying this new vibration system works on the client now, instead of just in Studio. Hope that all makes sense as a summary of it.
Just out of curiosity, how does Roblox internally handle controller inputs? As far as I know Windows’ xinput api is incompatible with Playstation controllers - are both DirectInput and Xinput being used when xbox and playstation controllers are plugged into the same machine, or is there some type of HID-level access that can be achieved regardless of input device? It must be a nightmare for you guys to keep everything uniform cross platform.
These are Haptic objects! Haptics make some devices shake. An example: you can use a haptic to make a game controller vibrate. Keep in mind not all devices can vibrate.
You can hook a HapticEffect up to a button, and the effect will play when you click with no scripts. There are properties on UI buttons now that you can connect it to, I believe.
I doubt it, but Unity, Roblox Studio, and the SDL library all have similar systems of input handling with Enums. Like the KeyCode system and events overall are nearly identical.
There are many versions of SDL; the most recent is around 3.0+, so the one I linked is outdated.
Since this is instance based it is entirely optional. Just like the new Input Action system is entirely optional or the new UIStyle system that just released, they are all fully optional and aren’t required to be used by the developer if they do not wish to implement them into their games.
I’m experiencing a very strange bug at the moment with haptic playback on iPhone 11.
In a published version of the provided HapticWorld.rbxl place, everything related to playing haptics works. But as soon as I use a HapticEffect in my game, playing a looped HapticEffect of type GameplayExplosion doesn’t work as intended. I’m unsure if other types are affected.
Playback only happens briefly during game load and game leave for about a second. With Looped turned off, manual calls to Play (called when the player presses a button) also do not start playback. The rest of this post refers to a looped effect.
There are also weird edge cases where playback does occur. My phone’s battery reached 10%, and I got the standard iOS “Low Battery” modal with the option to close it. While this modal is open, playback does occur; it stops when I close the modal. Opening Control Center also allows playback until it’s closed.