Does anyone understand how to replicate this?
Before I try to describe how this works, I want to make it clear that I really, really, really, really, REALLY do not recommend you use this in an actual game. It's not good whatsoever, and Roblox animations are not fit for this sort of application. With that said, this is how the system works:
Motion matching works as described in the YouTube videos: you take huge amounts of motion capture data and keep track of the movements of different key parts on the rig, along with the direction each frame is moving in. For example, for a piece of motion capture data where a human is walking forward, you could track the movement of each limb and also the root's trajectory. Take all this information and store it somewhere for later.
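The "store it somewhere for later" step can be sketched roughly like this. This is just an illustrative sketch, not my actual implementation: the frame format, the field names (`limb_positions`, `root_position`), and the choice of features are all made up for the example.

```python
import numpy as np

def build_feature_database(frames):
    """Turn raw mocap frames into searchable feature vectors.

    Each feature vector holds the limb positions (relative to the
    root, so world placement doesn't matter) plus the root's motion
    into the next frame, i.e. the trajectory of that frame.
    """
    features = []
    for i in range(len(frames) - 1):
        cur, nxt = frames[i], frames[i + 1]
        # Limb positions relative to the root of the same frame.
        limbs = cur["limb_positions"] - cur["root_position"]
        # Root trajectory: which way (and how fast) this frame moves.
        trajectory = nxt["root_position"] - cur["root_position"]
        features.append(np.concatenate([limbs.ravel(), trajectory]))
    return np.array(features)

# Tiny fake dataset: two frames, two "limbs" each.
frames = [
    {"limb_positions": np.array([[0.0, 1.0, 0.0], [0.0, -1.0, 0.0]]),
     "root_position": np.array([0.0, 0.0, 0.0])},
    {"limb_positions": np.array([[0.1, 1.0, 0.0], [0.0, -1.0, 0.1]]),
     "root_position": np.array([0.0, 0.0, 0.2])},
]
db = build_feature_database(frames)
print(db.shape)  # (1, 9): 2 limbs x 3 coords + 3 trajectory coords
```

A real version would also store velocities, future trajectory samples, and so on, but the idea is the same: one numeric vector per dataset frame.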
Next, take the character's input direction plus the character's current animation frame and find the closest match in the data you collected previously. If your character starts out in an idle frame and you start moving forward, the closest match would be a piece of the dataset where the player is taking their first step, and so it would match the motion (get it?) and play that specific animation frame next. This process repeats every frame, using the frame the player is currently on and their input trajectory, and you end up with a (somewhat terrible-looking) motion matching system.
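The per-frame matching step is basically a nearest-neighbour search over those feature vectors. Here's a hedged sketch of what that could look like; the weights and the pose/trajectory split are made-up tuning knobs, not anything from my actual system.

```python
import numpy as np

def find_best_match(database, current_pose, desired_trajectory,
                    pose_weight=1.0, trajectory_weight=2.0):
    """Return the index of the dataset frame closest to the query.

    `database` rows are [pose features | trajectory features].
    Weighting the trajectory higher makes the search favour frames
    that move the way the player is asking to move.
    """
    n_pose = len(current_pose)
    query = np.concatenate([
        pose_weight * current_pose,
        trajectory_weight * desired_trajectory,
    ])
    weighted = np.concatenate([
        pose_weight * database[:, :n_pose],
        trajectory_weight * database[:, n_pose:],
    ], axis=1)
    distances = np.linalg.norm(weighted - query, axis=1)
    return int(np.argmin(distances))

# Tiny example: 3 dataset frames, 2 pose features + 2 trajectory features.
database = np.array([
    [0.0, 0.0,  0.0, 0.0],   # idle, not moving
    [0.1, 0.0,  0.0, 1.0],   # first step forward
    [0.9, 0.2,  0.0, 1.0],   # mid-stride
])
# Idle pose, player pushing forward:
idx = find_best_match(database, np.array([0.0, 0.0]), np.array([0.0, 1.0]))
print(idx)  # -> 1, the "first step" frame
```

You'd run this every frame and jump playback to whichever dataset frame wins, which is where the "matching" in motion matching comes from. With a big dataset you'd want a KD-tree or similar instead of a brute-force scan.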
If your question now is "how do I get loads of motion capture data so I can try this myself?", all I can say is good luck, because it's very hard to get enough data without going out and renting a mocap suit yourself. Happy googling!
The dataset I used was ripped from obscure motion capture websites and tediously stitched together into a single continuous animation. I genuinely couldn't find those sites now if I tried, nor do I know if they still exist. I then had to retarget the animations to my character rig using two different versions of Unreal Engine, and I wouldn't wish that fate on my worst enemy!
Either way, I recommend everyone reading this not to bother with a system like this in Roblox. Just use simple state machines: you can get a very similar result in a fraction of the time using a motion capture library such as Mixamo, which provides all the animations you'd need for it. (And no, Mixamo doesn't really work for motion matching itself, since the dataset needs to be continuous motion capture data, with best results recorded using a specific "dance card", as they call it in the YouTube videos.)
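For contrast, the state machine alternative really is this simple. A minimal sketch, with made-up state names; in a real Roblox game each state would just play a looping Mixamo-style animation:

```python
# (current_state, player_has_input) -> next_state
TRANSITIONS = {
    ("idle", True): "walk",
    ("idle", False): "idle",
    ("walk", True): "walk",
    ("walk", False): "idle",
}

def step(state, has_input):
    """Advance the locomotion state machine one tick."""
    return TRANSITIONS[(state, has_input)]

state = "idle"
for has_input in [True, True, False]:
    state = step(state, has_input)
print(state)  # -> "idle", back to idle once input stops
```

Add run/jump/fall states the same way and crossfade between animations on each transition; that gets you most of the visual payoff with none of the dataset pain.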