I think the dots in the privacy settings tab are very unclear. Someone may think the dots are next to the Roblox icon on the taskbar. Also, someone may think these dots are only visible in the settings tab.
Already exists:
My solution:
This solution is much better: the user can understand that the dots are part of the in-game GUI.
Roblox removed it to give each game’s custom GUI more room. While this frees up 35 pixels of vertical space for developers, it is not future-proof: it prevents Roblox from adding more buttons to the top bar without interfering with other experiences.
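For what it’s worth, here’s a minimal sketch of claiming that space today (assuming a LocalScript under StarterGui; IgnoreGuiInset is the standard property for drawing under the top bar inset):

```lua
-- LocalScript in StarterGui: draw into the strip the top bar used to occupy
local GuiService = game:GetService("GuiService")
local Players = game:GetService("Players")

local topInset = GuiService:GetGuiInset() -- top-left inset; roughly the ~35px of Y space mentioned above

local screenGui = Instance.new("ScreenGui")
screenGui.IgnoreGuiInset = true -- render underneath the top bar inset as well
screenGui.Parent = Players.LocalPlayer:WaitForChild("PlayerGui")

local topStrip = Instance.new("Frame")
topStrip.Size = UDim2.new(1, 0, 0, topInset.Y) -- fill the freed strip
topStrip.BackgroundTransparency = 0.5
topStrip.Parent = screenGui
```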
Hello, are you still having this issue? If you have multiple cameras, one thing you could double-check is whether it is using a different one that is not pointed at your face. You can find camera settings under the main experience settings menu.
Yes, I am still actively encountering this issue. I have checked the client settings to confirm that the camera I want to use is the selected one. The blue light on my camera lights up, indicating that the client is accessing it.
I have also tried the other steps I listed in a separate post to try and get it working, but with no success.
Ok, thank you for the added context. Ugilicious is taking the lead on this, and I see they have already replied on your other post. We can continue the conversation there. Thank you for your help and patience!
I’ve had some time to get a feel for this system, and honestly, I don’t think it feels very expressive. I used to stream as a V-Tuber as a hobby back in 2020, so I’ve used camera-based facial and hand tracking before (although I’ve forgotten which software I used).

One thing about the software I used, though, was that it wasn’t a one-to-one mirror of my real face. Instead, the model had basic emotion ‘poses’, and I would calibrate it so that my real-life expression mapped to a pose on the model. Things like blinking and head position were always live (but again, I could calibrate them - something this system lacks but very much needs if I’m going to use it more often).

The dynamic heads lack a properly neutral expression as a default (except in R6), and honestly, it feels weird to me. It’s also very difficult to get a model like :] to display a frown, for somewhat obvious reasons.

I think introducing a pose system - like I described above - would be much more expressive, as well as fit in better with the rest of Roblox. At the moment, the facial tracking shows a literal representation of the player’s face in real life; I think Roblox’s art style lends itself better to a metaphorical representation.

tl;dr: use the face tracking to detect emotions and display those instead, and add calibration.
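To make the pose idea concrete, here’s a rough sketch (the thresholds and pose table are made up for illustration; the FACS channels like LeftLipCornerPuller are ones FaceControls already exposes):

```lua
-- Sketch of the proposed pose system: classify the live tracked FACS values
-- into a discrete emotion, then display a hand-authored pose instead of the
-- raw one-to-one mirror. Thresholds and poses here are hypothetical.
local CHANNELS = {
	"LeftLipCornerPuller", "RightLipCornerPuller", "JawDrop",
	"LeftEyeUpperLidRaiser", "RightEyeUpperLidRaiser",
}

local POSES = {
	smile = { LeftLipCornerPuller = 1, RightLipCornerPuller = 1 },
	surprised = { JawDrop = 0.7, LeftEyeUpperLidRaiser = 1, RightEyeUpperLidRaiser = 1 },
	neutral = {},
}

local function classifyEmotion(tracked) -- tracked: the FaceControls driven by the camera
	local smile = (tracked.LeftLipCornerPuller + tracked.RightLipCornerPuller) / 2
	if smile > 0.5 then
		return "smile"
	elseif tracked.JawDrop > 0.6 then
		return "surprised"
	end
	return "neutral"
end

local function applyPose(display, poseName) -- display: the FaceControls shown to others
	for _, channel in ipairs(CHANNELS) do
		display[channel] = 0 -- reset to neutral first
	end
	for channel, value in pairs(POSES[poseName]) do
		display[channel] = value
	end
end
```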
It’s been over two weeks and my main account has still not received this feature. I am over 13 and have been age-verified since the process was introduced. Additionally, I have not received any justified moderation action in at least two years. How long do I have to wait for this to roll out to me? It seems as though everyone else has been given access, yet I still don’t see the green dot under the Privacy tab in Settings.
It could be very successful in the future, but it also carries the risk of being used for malicious purposes.
So while it’s good to animate avatars with your movements, there will of course be players who abuse this feature.
Still, I’d say it’s a great feature if used correctly.
This doesn’t seem to be true; I can’t get facial animation working in Studio. (Which is making debugging very difficult, as I can’t observe the DataModel.)
It would be really helpful if this stuff were easier to access in Studio as well. Team Test kinda sucks because it forces you to commit your changes and save the game. I just want to be able to check how facial animation behaves in the DataModel, what properties it modifies, whether a change I made is compatible, etc., while still working efficiently in a way that is temporary.
Sorry for the inconvenience. We’re currently investigating an issue in Studio; once it’s resolved, facial animation from camera input in play mode will be enabled in Studio again.
In the meantime, regarding how facial animation behaves: you can add a dynamic head avatar to your workspace and expand the head item in it, and there you’ll see it has a FaceControls item.
When you click that and look at the properties, you’ll see a bunch of FACS properties whose values you can adjust between 0 and 1, and the avatar’s facial expression changes accordingly.
So that’s how facial animation works: by adjusting the values of those FACS properties via animation input.
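For example, you can drive those properties from a script as well. A minimal sketch (assuming a rig named "Rig" in the workspace whose Head contains a FaceControls instance):

```lua
-- Run in the command bar with a dynamic head rig in the workspace
local rig = workspace:WaitForChild("Rig") -- hypothetical rig name
local faceControls = rig.Head:FindFirstChildOfClass("FaceControls")

if faceControls then
	-- Each FACS property takes a value from 0 (neutral) to 1 (full pose)
	faceControls.LeftEyeClosed = 1 -- wink the left eye
	faceControls.JawDrop = 0.5 -- open the mouth halfway
	faceControls.LeftLipCornerPuller = 0.8 -- smile on the left side...
	faceControls.RightLipCornerPuller = 0.8 -- ...and on the right
end
```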
You can also open the Animation Editor and select the dynamic head avatar in the workspace, then create a new clip. In the Animation Editor, click the Face Editor button to adjust FACS properties for the face there, or press the Face Capture button to create an animation from camera tracking input.
How Self View works: it looks for an item in the avatar with FaceControls, or, as a fallback, for an item named “Head”, which should be a MeshPart or Part. It then focuses that in front view.
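In pseudocode, the lookup behaves roughly like this (a sketch of the behavior described above, not the actual implementation):

```lua
-- Sketch of the Self View target lookup described above
local function findSelfViewTarget(character: Model): Instance?
	-- Prefer whatever part carries a FaceControls instance
	for _, descendant in ipairs(character:GetDescendants()) do
		if descendant:IsA("FaceControls") then
			return descendant.Parent
		end
	end
	-- Fall back to an item named "Head" that is a MeshPart or Part
	local head = character:FindFirstChild("Head", true)
	if head and (head:IsA("MeshPart") or head:IsA("Part")) then
		return head
	end
	return nil
end
```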
Thanks for the explanation, but those properties are not what I was talking about.
Rather, I am talking about the details of how the Facial Animation system affects the character model and rig, and what it relies on in the instance hierarchy to work.
For example (see the sketch after this list):
- Facial Animation sets the Neck joint’s Transform property
- Facial Animation stops working if the Neck joint is disabled
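A quick way to watch that first behavior (a sketch, assuming an R15 character, where the Neck Motor6D sits under the Head):

```lua
-- LocalScript: log what facial animation writes to the Neck joint each frame
local RunService = game:GetService("RunService")
local Players = game:GetService("Players")

local character = Players.LocalPlayer.Character or Players.LocalPlayer.CharacterAdded:Wait()
local neck = character:WaitForChild("Head"):WaitForChild("Neck") -- R15 parents the Neck Motor6D under the Head

RunService.RenderStepped:Connect(function()
	-- Transform is the per-frame offset that animations (facial animation included) drive
	print(neck.Transform)
end)
```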
This kind of information is critical in any game that does non-trivial things with the character model, and Studio provides tools to assist in figuring these things out, which is why not being able to test in Play Solo / Local Servers is very bothersome. Team Test is workable, but it’s higher-friction (you need to commit all script changes) and has less functionality (e.g. you can’t test with multiple players on a single device like you can with Local Servers).
Thanks for the detail. Yeah, I totally see how that would be very useful to check during play mode in Studio; we’ll get that going again ASAP.
Regarding your list of points:
- Yes, when animation from camera input is enabled, it looks for a “Neck” joint and alters its CFrame orientation so the avatar’s head turns according to how you turn your head.
- When using mic only and no camera, you only get mouth animation (lipsync), so no head rotation is applied.
- Facial animation from camera input should still work when the Neck joint gets disabled.
- Thanks for reporting that issue; we’re looking into it.
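If you want to see that head-turn mechanism in isolation, here’s a minimal sketch (assuming a static rig named "Rig"; on a live character the Animator rewrites Transform every frame, so camera input would override this):

```lua
-- Command bar sketch: turn a rig's head by writing the Neck joint's Transform,
-- the same property the camera-driven head rotation alters
local rig = workspace:WaitForChild("Rig") -- hypothetical rig name
local neck = rig:WaitForChild("Head"):WaitForChild("Neck")

neck.Transform = CFrame.Angles(0, math.rad(30), 0) -- yaw the head 30 degrees
```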
Hello, just a quick update: the related issue in Studio has been fixed, and we’ve enabled camera/mic usage in Studio play mode again.
To use it, go to Game Settings/Communication and enable Camera/Microphone there, save/publish to Roblox, and restart Studio. You should then be able to use animation from the camera in Play Solo, Local Server, and Team Create Team Test, and the mic in Team Create Team Test, using the bubble chat mic/cam buttons.
Thanks for giving it a try, glad to hear =)
And yeah, it was beautiful to see our team pushing together to fix, test, and roll out the fix so smoothly. I love it when that happens; it makes me feel like part of a well-coordinated orchestra =)
All our teams are pushing on so many amazing things to make Roblox that bit better every day.
I know that, like everyone, we have our ups and downs. Sometimes things are not ideal right away; sometimes they need fixing, slight adjusting, or a bigger course correction. But when it all sings in harmony, it can be magical =)