Now that avatar bodies have been released, it's hard to decide whether to create an avatar with an open-palm hand or a closed fist. A closed fist works well for in-experience scenarios where the player character is holding an item, but an open hand is a more natural pose.
Existing tech: Dynamic Heads -
With the relatively recent release of dynamic heads, it would be really cool to see the same tech used to create dynamic hands that give developers more control over a character's hands using float values.
How? -
Fifteen values per hand: three for each of the five digits (a rough sketch of what this could look like follows the list) -
One for adjusting the side-to-side movement of the metacarpophalangeal (MCP) joint.
One for the back-and-forth movement of the metacarpophalangeal joint.
One for the shared curl of the proximal interphalangeal (PIP) and distal interphalangeal (DIP) joints.
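A minimal sketch of what this could look like from a LocalScript, assuming a hypothetical HandControls instance that mirrors how FaceControls works for dynamic heads; none of the instance or property names below exist in the engine today:

```lua
-- Hypothetical "HandControls" instance, analogous to FaceControls on dynamic heads.
-- All property names here are invented for illustration.
local Players = game:GetService("Players")

local character = Players.LocalPlayer.Character or Players.LocalPlayer.CharacterAdded:Wait()
local handControls = character:WaitForChild("RightHand"):FindFirstChild("HandControls")

if handControls then
	-- Three floats (0 to 1) per digit, shown here for the index finger:
	handControls.IndexFingerSpread = 0   -- side-to-side movement at the MCP joint
	handControls.IndexFingerBend = 0.6   -- back-and-forth movement at the MCP joint
	handControls.IndexFingerCurl = 0.6   -- shared curl of the PIP and DIP joints
	-- ...and likewise for Thumb, Middle, Ring and Pinky: fifteen values per hand.
end
```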
Why? -
Not only will this give developers control over grip for in-game items, but it’ll also open up possibilities for more expressive emotes and animations.
Potential problems! -
It's likely that exploiters or bad-faith actors would create insulting gestures, like flipping the bird. However, this could be avoided by having gestures work similarly to animations, where the gesture has to be created and uploaded by the place owner or Roblox.
Yeah, that’d work too! I’m really just hoping for something that would allow artists to upload avatars to the catalog with hands that developers can animate.
Can we also get an extra joint or two in the feet? Thinking about the future of character customization and how much I'd love support for high heels, but that isn't really possible when the character's feet are always flat. Also, just being able to point the toes for dance animations…
Also, edit as an afterthought: to prevent insulting gestures, you could reduce the number of independent joints and lock the fingers' movement together, basically having all the fingers move at the same time so they can't be separated into different shapes. The thumb could still move independently.
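A rough sketch of that idea, reusing the hypothetical HandControls properties from the earlier sketch: one value drives all four fingers together and only the thumb is independent.

```lua
-- Sketch of "linked" fingers: one curl value drives all four fingers at once so
-- they can't be posed into separate shapes; the thumb stays independent.
-- HandControls and its properties are hypothetical, as in the sketch above.
local function setLinkedGrip(handControls: Instance, fingersCurl: number, thumbCurl: number)
	for _, finger in {"IndexFinger", "MiddleFinger", "RingFinger", "PinkyFinger"} do
		handControls[finger .. "Curl"] = fingersCurl
		handControls[finger .. "Bend"] = fingersCurl
	end
	handControls.ThumbCurl = thumbCurl
	handControls.ThumbBend = thumbCurl
end
```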
What about VR? Games with VR support could finally have rudimentary hand tracking
using the controllers; this would be putting the control into the players' hands (quite literally).
Though this wouldn't be a problem for most controllers that approximate finger tracking with two triggers (like the Quest 2, which doesn't track each finger individually).
The problem of players making inappropriate hand gestures would only really arise on controllers similar to the Valve Index "Knuckles" controllers, which do per-finger tracking.
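For trigger-based controllers, a rough sketch of what this could look like, again using the hypothetical HandControls from earlier. The assumption that the right trigger reports as Enum.KeyCode.ButtonR2 with its analog value in Position.Z follows the standard gamepad mapping, but real VR controller mappings may vary:

```lua
-- Map a VR controller's analog trigger onto the hypothetical HandControls curls.
-- Assumes the right trigger reports as Enum.KeyCode.ButtonR2 (actual mappings vary).
local Players = game:GetService("Players")
local UserInputService = game:GetService("UserInputService")

local character = Players.LocalPlayer.Character or Players.LocalPlayer.CharacterAdded:Wait()
local handControls = character:WaitForChild("RightHand"):FindFirstChild("HandControls")

UserInputService.InputChanged:Connect(function(input)
	if handControls and input.KeyCode == Enum.KeyCode.ButtonR2 then
		local pressure = input.Position.Z -- 0 (released) to 1 (fully squeezed)
		for _, finger in {"Thumb", "IndexFinger", "MiddleFinger", "RingFinger", "PinkyFinger"} do
			handControls[finger .. "Curl"] = pressure
		end
	end
end)
```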
I am happy to stop this conversation after you reply (it is bloating the read time needlessly; I just checked and it is half of the replies!).
I think more joints for new avatars should be considered. A dynamic hand instance, similar to how dynamic faces work, would make this very easy for developers. The new UGC avatars could also use this effectively, or even the ability to have entirely custom bone structures, such as pseudo-R6 characters with fewer joints, characters with digitigrade legs, or any other exotic anatomy.
Native VR integration, similar to how we can currently use the camera to move our faces, is another good point.
Avatar emotes, like the point and wave, would benefit from animated fingers, or at the very least a set of pre-defined poses - imagine if the “Point” emote in the avatar shop worked with EVERY UGC avatar, no matter how their limbs are oriented, or how many fingers they have, or how many bones are in their legs.
A generalized avatar system that supports more than the R15 joints would be great for this, especially if it has the ability to make some parts of the hierarchy optional for a given character. One simple extension would be, e.g., an "R17" system that has one additional joint inside each foot; existing avatar animations could be updated to animate this joint, and every existing R15-only avatar would simply not have (and therefore not animate) these joints, as sketched below.
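A small sketch of how such an optional joint could degrade gracefully, assuming an invented "LeftToeBase" Motor6D inside the foot; rigs without it would simply skip the extra track:

```lua
-- Pose an optional toe joint only if the rig actually has it; plain R15 avatars
-- skip this and behave exactly as they do today. "LeftToeBase" is an invented name.
local function poseLeftToe(character: Model, offset: CFrame)
	local leftFoot = character:FindFirstChild("LeftFoot")
	local toeJoint = leftFoot and leftFoot:FindFirstChild("LeftToeBase") -- Motor6D, if present
	if toeJoint and toeJoint:IsA("Motor6D") then
		toeJoint.Transform = offset -- simplified; a real animation track would drive this
	end
end
```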