Animate your Avatar with your movement

I'm still really frustrated that Roblox support didn't give me this feature and tried to change the subject by telling me about the feature in Roblox Studio rather than in the actual Roblox privacy settings.

4 Likes

I regret to bother you, but are we going to receive a proper response on Roblox’s end about taking old faces* off-sale and jacking up the prices of new ones? The community has gotten nothing but a vague response about “expression” and “benchmarking” from your coworker SergeantBlocky. We would really appreciate it if the old faces were put back on sale and the new ones were priced to match the originals. Thank you.
image

10 Likes

I don’t have a problem with hats, as they have not been taken off-sale; rather, they have been converted to Accessory instances.

4 Likes

I think he simply made a typo; he probably meant heads.

6 Likes

I think it should only be visible when the mouse is close to it. For example, when you move your mouse towards the middle of the screen and you're not pointing at a ScreenGui element, the voice chat controls would appear; when the opposite happens, they would disappear.
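For what it's worth, here's a rough Luau sketch of that behavior. The GUI name `VoiceChatGui` and the distance threshold are placeholders I made up, not names from the actual feature:

```lua
-- LocalScript sketch: show the voice chat controls only when the mouse is
-- near the centre of the screen and not over any other GUI element.
-- All names and thresholds here are illustrative.
local UserInputService = game:GetService("UserInputService")
local RunService = game:GetService("RunService")
local Players = game:GetService("Players")

local playerGui = Players.LocalPlayer:WaitForChild("PlayerGui")
local voiceControls = playerGui:WaitForChild("VoiceChatGui") -- placeholder ScreenGui name

local SHOW_RADIUS = 150 -- pixels from screen centre; tune to taste

RunService.RenderStepped:Connect(function()
	local mousePos = UserInputService:GetMouseLocation()
	local center = workspace.CurrentCamera.ViewportSize / 2
	local nearCenter = (mousePos - center).Magnitude <= SHOW_RADIUS

	-- hide the controls if the mouse is over some *other* GUI element
	local overOtherGui = false
	for _, object in ipairs(playerGui:GetGuiObjectsAtPosition(mousePos.X, mousePos.Y)) do
		if not object:IsDescendantOf(voiceControls) then
			overOtherGui = true
			break
		end
	end

	voiceControls.Enabled = nearCenter and not overOtherGui
end)
```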

3 Likes

I think that this feature will be a great addition to the forms of communication available on Roblox. But as cool as this feature is, the in-experience UI is seriously lacking. It is very inconsistent with the rest of the Roblox UI, and looks unfinished and incomplete for what I expect to see from Roblox. I personally think that the color scheme of the UI should be consistent with the color scheme of the chat, new BubbleChat, and TopBar. The microphone and camera icons floating above the character’s head are very intrusive and distracting, and take away from the actual experience.

The concept that @CSJeverything created is honestly closer to what I expected to see in the actual feature, but I would prefer it to have a light-on-dark color scheme, as that is consistent with the Chat and TopBar UI.

I don’t really see a reason why these controls couldn’t be consolidated into the TopBar. In my opinion, that seems like a better use of that space than the “BETA” button that is currently there.

This feedback may sound a little harsh but I have tried to keep my criticisms as constructive as possible. I really hope that Roblox engineers can take this feedback into consideration and I look forward to adding new forms of communication to my creations on Roblox.

8 Likes

I think the dots in the privacy settings tab are very unclear. Someone might think the dots are next to the Roblox icon on the taskbar, or that they are only visible in the settings tab.

Already exists:
image

My solution:

image

This solution is much better: the user can tell that the dots are part of the in-game GUI.

4 Likes

Because there is no top bar.

Roblox removed it to give the game’s custom GUI more space. While this gives developers an extra 35 pixels of free Y space, it is not future-proof and prevents Roblox from adding more buttons to the top bar (since they would interfere with other experiences).

2 Likes

Hello, just a quick update: version 589, which has a fix for this issue, has now been released on iOS, Android, Windows, and Mac.

6 Likes

This could be useful for future games or simulators if this feature gets accepted and adopted across Roblox.

3 Likes

Hello, are you still having this issue? If you have multiple cameras, one thing you could double-check is whether it is using a different one that is not pointed at your face. You can find camera settings under the main experience settings menu.

3 Likes

Yes, I am still actively encountering this issue. I have checked the client settings to confirm that the camera I want to use is the selected camera. The blue light on my camera lights up indicating that the client is accessing the camera.

I have also tried other steps I listed on a separate post to try and get it working, but there has been no success.

4 Likes

Ok, thank you for the added context. Ugilicious is taking the lead on this, and I see he has already replied on your other post. We can continue the conversation there. Thank you for your help and patience!

4 Likes

I’ve had some time to get a feel for this system, and honestly, I don’t think it feels very expressive.
I used to stream as a V-Tuber as a hobby back in 2020, so I’ve used camera-based facial and hand-tracking before (although I’ve forgotten which software I used).
One of the things about the software I used, though, was that it wasn’t a one-to-one mirror of my real face. Instead, the model had basic emotion ‘poses’ and I would calibrate it so that my real-life expression would map to a pose on the model. Things like blinking and head position were always live (but again, I could calibrate it - something this system lacks but very much needs if I’m going to use it more often).
The dynamic heads lack a properly neutral expression as a default (except in R6), and honestly, it feels weird to me. It’s also very difficult to get a model like :] to display a frown, for somewhat obvious reasons. I think introducing a pose system - like I described before - would be much more expressive, as well as fit in better with the rest of Roblox. At the moment, the facial tracking shows a literal representation of the player’s face in real life. I think Roblox’s art style lends itself better to a metaphorical representation.
tl;dr use the face tracking to detect emotions and display that instead, add calibration
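To make the pose idea concrete, here's a rough Luau sketch of quantizing tracked values into discrete poses instead of mirroring them one-to-one. The property names, thresholds, and pose tables are all my own assumptions for illustration, not actual Roblox API guarantees (browse a FaceControls instance in Studio for the real FACS property list):

```lua
-- Sketch: classify live face-tracking values into a small set of poses,
-- then drive the avatar with the pose rather than the raw mirror.
-- Property names and thresholds are assumptions, not verified API.
local POSES = {
	happy = { LeftLipCornerPuller = 1, RightLipCornerPuller = 1 },
	sad = { LeftLipStretcher = 0.8, RightLipStretcher = 0.8 }, -- stand-in frown pose
	neutral = { LeftLipCornerPuller = 0, RightLipCornerPuller = 0 },
}

-- "tracked" is assumed to be a table of raw tracking values in [0, 1]
local function classify(tracked)
	local smile = ((tracked.LeftLipCornerPuller or 0) + (tracked.RightLipCornerPuller or 0)) / 2
	local frown = tracked.Frown or 0 -- hypothetical aggregate frown signal
	if smile > 0.4 then
		return "happy"
	elseif frown > 0.4 then
		return "sad"
	end
	return "neutral"
end

local function applyPose(faceControls, poseName)
	for property, value in pairs(POSES[poseName]) do
		faceControls[property] = value
	end
end
```

Calibration would then just be per-user adjustment of those thresholds.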

6 Likes

It’s been over two weeks and my main account has still not received this feature. I am over 13 and have been age verified since the process was introduced. Additionally, I have not received any form of justified moderation in at least two years. How long do I have to wait for this to roll out to me? Seems as though everyone else has been given access and I still don’t see the green dot under the Privacy Tab in Settings.

6 Likes

Rollouts usually take about a month from my experience. It’s random from what I’ve heard.

2 Likes

It could be very successful in the future, but it also carries the risk of being used with malicious intent.
So while it’s good to animate avatars with your movements, there will of course be players who abuse this feature.

But I do say it’s a great feature if used correctly in the future.

3 Likes

This doesn’t seem to be true; I can’t get facial animation working in Studio. (Which is making debugging very difficult, as I can’t observe the DataModel.)

It would be really helpful if this stuff was easier to access in Studio as well. Team Test kinda sucks because it forces you to commit your changes and save the game. I just want to be able to check how facial animation behaves in the DataModel, what properties it modifies, whether a change I made was compatible, etc. etc. while still doing things efficiently in a way that is temporary.

2 Likes

Sorry for the inconvenience. We’re currently investigating an issue in Studio; once it’s resolved, facial animation from camera input in play mode will be enabled in Studio again.

In the meantime, regarding how facial animation behaves: you can add a dynamic head avatar to your workspace and expand the Head item inside it, and there you’ll see it contains a FaceControls item.

When you click that and look at its properties, you’ll see a set of FACS properties. You can adjust their values between 0 and 1 and watch the expression on the avatar’s face change accordingly.

So that’s how the facial animation works: it adjusts the values of those FACS properties via animation input.
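The same adjustment can also be done from a script. A minimal Luau sketch follows; the rig name is a placeholder, and `JawDrop`, `LeftEyeClosed`, and `RightEyeClosed` are examples from the FACS property set you can browse in the Properties window:

```lua
-- Sketch: drive FACS properties on a dynamic head from a script.
local RunService = game:GetService("RunService")

local character = workspace:WaitForChild("DynamicHeadRig") -- placeholder rig name
local head = character:WaitForChild("Head")
local faceControls = head:FindFirstChildOfClass("FaceControls")

-- open the mouth halfway (FACS values are in the 0..1 range)
faceControls.JawDrop = 0.5

-- a simple scripted blink using two example FACS properties
local elapsed = 0
RunService.Heartbeat:Connect(function(dt)
	elapsed += dt
	local blink = math.clamp(math.sin(elapsed * 4), 0, 1)
	faceControls.LeftEyeClosed = blink
	faceControls.RightEyeClosed = blink
end)
```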

You can also open the Animation Editor, select the dynamic head avatar in the workspace, and create a new clip. In the Animation Editor, click the Face Editor button to adjust FACS properties for the face, or press the Face Capture button to create an animation from camera tracking input.

How Self View works: it looks for an item in the avatar with FaceControls or, as a fallback, for an item named “Head”, which should be a MeshPart or Part. It then focuses that in the front view.
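That lookup could be approximated in Luau like this (a sketch of the behavior as described above, not the actual Self View source):

```lua
-- Approximation of the described Self View focus lookup: prefer the part
-- holding a FaceControls instance, else fall back to a part named "Head".
local function findSelfViewFocus(character)
	local faceControls = character:FindFirstChildWhichIsA("FaceControls", true)
	if faceControls then
		return faceControls.Parent -- the MeshPart (or Part) containing the face rig
	end
	local head = character:FindFirstChild("Head", true)
	if head and head:IsA("BasePart") then
		return head
	end
	return nil -- nothing suitable to focus on
end
```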

Some further reading and info:

Hope it helps =)

3 Likes

Great, thanks :slight_smile:

Thanks for the explanation, but those properties are not what I was talking about.

Rather, I am talking about details of how the Facial Animation system affects the character model, rig, and what things it relies on in the instance hierarchy to work.

For example:

This kind of information is critical in any game that does non-trivial things with the character model, and Studio provides tools to assist in figuring these things out. Which is why not being able to test in Play Solo / Local Servers is very bothersome. Team Test is workable, but it’s higher friction (you need to commit all script changes) and has less functionality (e.g. you can’t test with multiple players on a single device like you can with Local Servers).

6 Likes