We want to make it easy to create cross-platform experiences that take full advantage of each platform’s unique capabilities. For VR, this means being able to animate your avatar with your head and hand movements.
Today, we’re adding a new API: VRService.AvatarGestures. When enabled, this property allows VR players’ avatars to follow their head and hand movements in both first and third person, and other players around them will see those movements as well. As a reminder, if players are made uncomfortable by any gestures, they can toggle the safety bubble in the bottom bar to turn nearby avatars invisible.
Note: If you have forked PlayerScripts, we recommend merging in the latest updates to get the new camera changes.
VRService
To support this change, VRService is no longer only a client-side service. VRService.AvatarGestures must be toggled server-side to be active, and other previously client-only APIs such as VRService.AutomaticScaling can now be toggled on the server and replicated to clients.
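As a minimal sketch, a server Script (placed, for example, in ServerScriptService) could enable both properties so they replicate to every client; the Enum.VRScaling.World value shown is one possible setting for AutomaticScaling:

```lua
-- Server Script, e.g. in ServerScriptService
local VRService = game:GetService("VRService")

-- Both properties can now be set server-side and replicate to clients
VRService.AvatarGestures = true
VRService.AutomaticScaling = Enum.VRScaling.World
```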
Furthermore, you can now add VRService as an optional service by right-clicking in the Explorer window, and then toggle these settings in the Studio UI.
AvatarGestures
AvatarGestures must be toggled on the server. This new feature adds several parts and IKControls to a VR player’s avatar to animate it and replicate their movements to other players. The feature respects a player’s existing comfort and control settings.
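Because the property replicates, a client can observe when it changes. Here is a minimal sketch of a LocalScript reacting to the toggle (the print is just for illustration):

```lua
-- LocalScript, e.g. in StarterPlayerScripts
local VRService = game:GetService("VRService")

-- Fires when the server (or Studio) changes the property
VRService:GetPropertyChangedSignal("AvatarGestures"):Connect(function()
	print("AvatarGestures is now:", VRService.AvatarGestures)
end)
```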
Feedback
As always, we welcome your feedback on this new feature and overall VR experience. We’re excited to see how you will utilize this new avatar animation feature to create more immersive and engaging VR experiences.
Awesome update! It will save everyone some time by making a VR system a core feature! I’m just wondering if this feature will be toggled on everywhere after some time.
This is awesome. It solves the big use case that Nexus VR Character Model has tried to solve: immersion. Ever since the Meta Quest’s launch, I’ve been critical of how immersion-breaking it was for your character’s movements not to match your real movements. Having this exist natively, with a single property change, is huge.
But… it isn’t perfect. As of my testing yesterday, teleporting players is completely broken. If you move the character using movement inputs (joystick, probably keyboard), you are fine. But once you set HumanoidRootPart.CFrame = ..., instead of your camera and character moving together, your character moves and then tries to move back to where your camera was. That is, until you start moving, at which point the camera snaps to where the character is.
I can reproduce this on a baseplate with a simple “Button X to teleport” script.
-- LocalScript: teleport the character when ButtonX is pressed
game.UserInputService.InputBegan:Connect(function(Input)
	if Input.KeyCode == Enum.KeyCode.ButtonX then
		-- Offset the character 10 studs along the world -Z axis
		local HumanoidRootPart = game.Players.LocalPlayer.Character.HumanoidRootPart
		HumanoidRootPart.CFrame = CFrame.new(0, 0, -10) * HumanoidRootPart.CFrame
	end
end)
It is harder to see, but setting game.Players.LocalPlayer.CameraMode = Enum.CameraMode.LockFirstPerson will also yield the same result.
For those who use Nexus VR Character Model, all versions (V1 to V2.10) are probably incompatible with this. V3 will exist at some point to provide some feature parity (teleport controls, custom backpack, and a bit more) while supporting this. (Edit: For the adventurous, a preview version is out)
However, if you don’t need the teleport controls, it might be time to switch off. The custom backpack is not tied directly to Nexus VR Character Model, and I think Roblox’s implementation is going to be enough for most cases (maybe not roller coasters though, as I don’t recall the camera being allowed to roll with the seat you are on).
(Edit: I mistook HeadLocked for Camera.VRTiltAndRollEnabled, which Nexus VR Character Model already enables.)
This is cool. Nice that we finally don’t have to write our own code to make this work. How would animations look with gestures active, though? And does it work for R6? (Probably not.)