I have a dictionary called data that maps joints of the human body to their positions in 3D space, based on this documentation:
POSE_WORLD_LANDMARKS
Another list of pose landmarks in world coordinates. Each landmark consists of the following:
x, y and z: Real-world 3D coordinates in meters with the origin at the center between hips.
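To make the setup concrete, here is a minimal sketch of what such a data dictionary could look like and how to get a direction vector between two joints from it. The joint names and coordinate values below are made-up sample data, not output from the actual project:

```python
# Sketch of a `data` dictionary of world landmarks in meters,
# origin at the midpoint between the hips (MediaPipe convention).
# Joint names and values are illustrative sample data.

def vector_between(data, joint_a, joint_b):
    """Direction vector from joint_a to joint_b, as an (x, y, z) tuple."""
    ax, ay, az = data[joint_a]
    bx, by, bz = data[joint_b]
    return (bx - ax, by - ay, bz - az)

data = {
    "left_shoulder": (-0.15, -0.45, 0.02),
    "left_elbow":    (-0.30, -0.25, 0.05),
    "left_wrist":    (-0.42, -0.05, 0.08),
}

# Upper-arm direction: shoulder -> elbow
print(vector_between(data, "left_shoulder", "left_elbow"))
```

Vectors like this are what you would ultimately convert into joint rotations on the avatar's rig.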
As you can see in the image above, this actually works, but it's inaccurate. For example, when I reach out in real life, my avatar's arm faces a different direction. Here’s how I did it:
Can I use VRService to control the whole body, similar to how VRService:RequestNavigation() works? If not, is there another strategy I can use?
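One possible cause of the "arm faces a different direction" problem is a mismatch between coordinate conventions. This is only a guess about the setup here, but as a sketch: assuming MediaPipe's world landmarks use x right, y down, z toward the camera, while Roblox uses x right, y up, and -z as forward, the conversion would flip two axes:

```python
def mediapipe_to_roblox(x, y, z):
    """Convert a landmark to a Roblox-style axis convention.

    Assumption (not confirmed by the original post):
    MediaPipe: x right, y down, z toward the camera.
    Roblox:    x right, y up,  -z forward.
    So y and z both flip sign.
    """
    return (x, -y, -z)

print(mediapipe_to_roblox(0.1, -0.4, 0.2))  # -> (0.1, 0.4, -0.2)
```

If the avatar mirrors or reverses motions, checking each axis flip like this is usually the first thing to verify.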
Thank you for your interest. I’ll let you know when it does! This is a fun project. I can high-five my friends, but I have to twist my arms in real life.
Next up: Legs, knee joints and full three-dimensional head movements.
Because I want to move on to something new, I might stop at 3D head movement and one-dimensional full-body movement. However, the project is still open source, so anyone can contribute and help too! Lua and Python programmers needed!
Me pointing at my friend who came in first place in a race, his head turned towards me while he stretches his legs and poses for the audience.
Summary: Arms, elbows, hips and knees can each move in one dimension, while the neck can turn 180 degrees from left to right.
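Since each of those joints moves in one dimension, each one can be driven by a single angle. A hedged sketch of how such an angle could be computed from three landmarks with a dot product (the function and sample points are illustrative, not taken from the project's code):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by points a-b-c in 3D."""
    v1 = tuple(ai - bi for ai, bi in zip(a, b))
    v2 = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to [-1, 1] to guard against floating-point drift
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_angle))

# Straight arm: shoulder, elbow, wrist roughly collinear -> ~180 degrees
print(round(joint_angle((0, 0, 0), (0.3, 0, 0), (0.6, 0, 0))))  # -> 180
```

On the Roblox side, the resulting angle would then be applied to the corresponding Motor6D or joint each frame.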
I’m going to update the GitHub repository so that there are instructions on how to use this. I’m not going to work on this anymore because I’m moving on.
I’ll let everyone know soon; I still need to create a manual on the GitHub page and make some performance improvements. Thank you for your interest!
Good news! It turns out I was definitely not done, and I added a lot more features like hip rotation and access control. I’ll add setup instructions to the GitHub page soon. Here’s a teaser: