Animate your Avatar with your movement

It’s been over two weeks and my main account has still not received this feature. I am over 13 and have been age verified since the process was introduced. Additionally, I have not received any form of justified moderation in at least two years. How long do I have to wait for this to roll out to me? Seems as though everyone else has been given access and I still don’t see the green dot under the Privacy Tab in Settings.

6 Likes

Rollouts usually take about a month in my experience. From what I've heard, it's random.

2 Likes

It could be very successful in the future, but it also carries the risk of being used with malicious intent.
So while it's good to animate avatars with your movements, there will of course be players who abuse this feature.

That said, I do think it's a great feature if used correctly.

3 Likes

This doesn't seem to be true; I can't get facial animation working in Studio. (Which is making debugging very difficult, as I can't observe the DataModel.)

It would be really helpful if this stuff were easier to access in Studio as well. Team Test kinda sucks because it forces you to commit your changes and save the game. I just want to be able to check how facial animation behaves in the DataModel, what properties it modifies, whether a change I made is compatible, etc., while still iterating quickly and without making anything permanent.

2 Likes

Sorry for the inconvenience. We're currently investigating an issue in Studio; once it's resolved, facial animation from camera input in play mode will be enabled in Studio again.

In the meantime, regarding how facial animation behaves: you can add a dynamic head avatar to your workspace and expand the Head item inside it, and you'll see it contains a FaceControls item.

When you select that and look at its properties, you'll see a set of FACS properties whose values you can adjust between 0 and 1, and the avatar's facial expression changes accordingly.

So that's how facial animation works: the animation input adjusts the values of those FACS properties.
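If you'd rather poke at this from the command bar, here's a rough sketch (the rig name is a placeholder, and you should double-check the exact FACS property names against what you see in the Properties panel):

```lua
-- Run from the Studio command bar with a dynamic head avatar in the workspace.
-- "DynamicHeadRig" is a placeholder; use your own model's name.
local rig = workspace:FindFirstChild("DynamicHeadRig")
local faceControls = rig and rig:FindFirstChildWhichIsA("FaceControls", true)

if faceControls then
	-- Each FACS property takes a value between 0 and 1.
	faceControls.JawDrop = 0.6     -- open the mouth a bit
	faceControls.LeftEyeClosed = 1 -- close the left eye
else
	warn("No FaceControls found; make sure the avatar has a dynamic head")
end
```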

You can also open the Animation Editor, select the dynamic head avatar in the workspace, and create a new clip. In the Animation Editor, click the Face Editor button to adjust the FACS properties for the face there, or press the Face Capture button to create an animation from camera tracking input.

How Self View works: it looks for an item in the avatar that contains FaceControls, or as a fallback for an item named "Head", which should be a MeshPart or Part. It then focuses that item in a front view.
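In rough code terms, that lookup order is something like the following (my own sketch of the behavior described above, not the actual implementation):

```lua
-- Rough sketch of the Self View lookup order described above (not the real code).
local function findSelfViewTarget(character)
	-- Prefer the part that holds a FaceControls instance...
	local faceControls = character:FindFirstChildWhichIsA("FaceControls", true)
	if faceControls then
		return faceControls.Parent
	end
	-- ...otherwise fall back to an item named "Head" that is a MeshPart or Part.
	local head = character:FindFirstChild("Head", true)
	if head and (head:IsA("MeshPart") or head:IsA("Part")) then
		return head
	end
	return nil
end
```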

Some further reading and info:

Hope it helps =)

4 Likes

Great, thanks :slight_smile:

Thanks for the explanation, but those properties are not what I was talking about.

Rather, I am talking about the details of how the Facial Animation system affects the character model and rig, and what it relies on in the instance hierarchy in order to work.

For example:

This kind of information is critical in any game that does non-trivial things with the character model, and Studio provides tools to help figure these things out, which is why not being able to test in Play Solo / Local Servers is very bothersome. Team Test is workable, but it's higher friction (you need to commit all script changes) and has less functionality (e.g. you can't test with multiple players on a single device like you can with Local Servers).

6 Likes

Thanks for the detail. Yeah, I totally see how that would be very useful to check during play mode in Studio; we'll get that going again ASAP.
Regarding your points list:

- Yes, when animation from camera input is enabled, it looks for a "Neck" joint and alters its CFrame orientation so the avatar's head turns according to how you turn your own head (see the sketch after this list).
- When using the mic only and no camera, you only get mouth animation for lip sync, so no head rotation is applied.
- Facial animation from camera input should still work when the Neck joint is disabled.
- Thanks for reporting that issue; we're looking into it.
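If you want to watch that head rotation from a script, a small sketch like this should do it (my assumptions, not guaranteed: an R15-style rig where the Motor6D named "Neck" sits under the Head part, and that the camera-driven rotation shows up on the joint's Transform). Run it as a LocalScript during play mode with the camera enabled:

```lua
-- LocalScript: print the Neck joint's animated rotation each frame.
-- Assumes an R15-style rig where the Motor6D named "Neck" is under the Head part,
-- and that the camera-driven head rotation lands in the joint's Transform.
local Players = game:GetService("Players")
local RunService = game:GetService("RunService")

local character = Players.LocalPlayer.Character or Players.LocalPlayer.CharacterAdded:Wait()
local neck = character:WaitForChild("Head"):WaitForChild("Neck")

RunService.Heartbeat:Connect(function()
	local rx, ry, rz = neck.Transform:ToEulerAnglesXYZ()
	print(string.format("Neck rotation (deg): %.1f %.1f %.1f",
		math.deg(rx), math.deg(ry), math.deg(rz)))
end)
```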

5 Likes

Yeah, you're right. I guess I was mistaking it for a different problem.

5 Likes

Glad to hear that part works as expected, and thanks again for the feedback and for reporting the other issue :+1:

6 Likes

Hello, just a quick update: the related issue in Studio has been fixed, and we have enabled camera/mic usage in Studio play mode again.
To use it, go to Game Settings > Communication and enable Camera/Microphone there, save/publish to Roblox, and restart Studio. You should then be able to use animation from the camera in Play Solo, Local Server, and Team Create Team Test, and the mic in Team Create Team Test, via the bubble chat mic/cam buttons.

5 Likes

Awesome, it's now working in all Studio test modes :slight_smile: I'm impressed with how quickly you were able to make the change, haha

4 Likes

Thanks for giving it a try, glad to hear =)
And yeah, it was beautiful to see our team pulling together to fix, test, and roll out the fix so smoothly. I love it when that happens; it makes me feel like part of a well-coordinated orchestra =)

All our teams are pushing on so many amazing things to make Roblox that bit better every day.
I know that, like everyone, we have our ups and downs: sometimes things aren't ideal right away, sometimes they need fixing, slight adjusting, or a bigger course correction, but when it all sings in harmony, it can be magical =)

5 Likes

I wanted to cross-post a bug report I just created, as the "Enable Camera" setting blocks the ability to set any Motor6D.Transform on a character, regardless of whether they are using the camera or not. This currently blocks games using Nexus VR Character Model from being able to use facial animations.
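For anyone who wants to try reproducing it, a minimal sketch along these lines should show whether the write sticks with the camera setting on or off (my own sketch, not taken from the bug report; it just overwrites the Neck joint's Transform every pre-simulation step, which is roughly how VR character systems drive joints):

```lua
-- LocalScript: minimal repro sketch (my own, not from the bug report).
-- Pin the head at a fixed yaw by writing Motor6D.Transform every Stepped,
-- roughly how systems like Nexus VR Character Model drive character joints.
local Players = game:GetService("Players")
local RunService = game:GetService("RunService")

local character = Players.LocalPlayer.Character or Players.LocalPlayer.CharacterAdded:Wait()
local neck = character:WaitForChild("Head"):WaitForChild("Neck")

RunService.Stepped:Connect(function()
	neck.Transform = CFrame.Angles(0, math.rad(45), 0)
end)
-- Per the report above, this holds with "Enable Camera" off but appears to be
-- overridden once the camera setting is enabled.
```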

3 Likes

I still haven't been given the option to enable this feature for my own account, but I can enable it in games through Studio even though I have no way of testing it. Is there any way to request access to this feature?

4 Likes

For additional context, I have had voice enabled since around when it came out, but I still haven't gotten facial animation enabled.

3 Likes

Hi! Just asking about the developer/user rollout. I'm not sure what qualifications one would need to get developer access; however, I still have not received facial animation support on my account. I am 13+ and ID-verified.

3 Likes

Update on UI: We've released a visual update to the bubble toggles to address the feedback on the UI visuals. Big thanks to @liztapioca7 and @daweezy99 for working on these changes.

For those who still wish to remove the bubble toggle UI altogether, don't worry! You will be able to do so later with the full release of the new experience controls.

4 Likes

Looks better for sure, but there are two very small issues I noticed right away:

1: At certain zoom levels the size of the bubble changes weirdly (getting bigger or smaller); this happens with any body type, R15 or R6 (see video).

2: This only happens with the R6 body: when R6 is enabled and you zoom in close enough, the bubble chat is halfway inside your head (which you can see at the end of the video).

1 Like

Why is the facial animation feature only being rolled out to certain users?
I still do not have the option to test this feature.

5 Likes

Glad to see that the UI has been fixed. However, from what I can see while reading this, many people, including myself, do not have access to this feature. The toggle does not show up on the Privacy page in Settings; it only shows "Microphone Input", and it has been that way since the feature's release. I know you said it was being rolled out, but this was announced over a month ago.

Any ideas why this might be?

2 Likes