How can I make Half Life 2 Lip Sync?
I don’t want to make millions of animations just for the mouth. I want to add a lot of voice lines, which is why I’m asking.
Check how loud the audio currently is and drive the facial animation from that. You can find many tutorials on AudioListeners, and the facial animation side is easy because the “FaceControls” instance exposes values you can edit directly.
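To make the idea concrete, here's a minimal sketch of amplitude-driven lip flap. It assumes the character's head has a `FaceControls` instance and that the voice line plays through a `Sound` named `VoiceLine` parented to the head (both names are placeholders, adjust to your setup):

```lua
-- Minimal amplitude-driven lip flap (a sketch, not full lip sync).
-- Polls the Sound's PlaybackLoudness each frame and maps it onto JawDrop.
local RunService = game:GetService("RunService")

local head = script.Parent:WaitForChild("Head")
local faceControls = head:WaitForChild("FaceControls")
local voiceLine = head:WaitForChild("VoiceLine") -- a Sound instance (assumed name)

RunService.Heartbeat:Connect(function(dt)
	-- PlaybackLoudness ranges roughly 0..1000; normalize to 0..1.
	local loudness = math.clamp(voiceLine.PlaybackLoudness / 500, 0, 1)
	-- Ease toward the target so the jaw doesn't jitter every frame.
	faceControls.JawDrop = faceControls.JawDrop
		+ (loudness - faceControls.JawDrop) * math.min(dt * 12, 1)
end)
```

The smoothing step matters more than the exact loudness divisor: snapping `JawDrop` straight to the raw loudness is what produces the "weird" flapping the next reply complains about.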
Yeah well, but how could I make it not look weird? Opening and closing the mouth is easy to do, but I mean, how can I make it look like realistic mouth motion?
If you watch the HL2 documentary, the amount of effort that went into facial animation is so incredibly detailed that it’s probably not in the realm of possibility for Roblox. I also believe that game has premade animations for its audio tracks, because it would not look nearly as good if it animated purely off the audio. You can record audio and video with your webcam and import it onto a face for the animation, though, I believe.
Don’t say it’s not possible. We’re not dealing with an internal pipeline issue here (which only Roblox could improve); we can script HL2-style lip sync ourselves. It’ll just take quite a bit of time.
Quite honestly, I totally forgot about this post, but the audio analysis could be replaced by passing a string to whatever handles the lip sync and shaping the mouth based on the emotional state of that string/sentence (not sure if this makes a lot of sense when reading).
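A toy version of that text-driven idea could look like this. This is a crude sketch, not Valve's phoneme extraction: vowels open the jaw by different amounts, everything else nearly closes it, and an exclamation mark stands in for "emotional state" by exaggerating the motion. The `speak` helper and its timing parameter are made up for illustration:

```lua
-- Toy text-driven mouth motion: step through the sentence character by
-- character and set JawDrop per letter. Assumes a FaceControls instance
-- under the character's Head.
local faceControls = script.Parent:WaitForChild("Head"):WaitForChild("FaceControls")

-- Rough per-vowel jaw openness; consonants fall back to a small value.
local VOWELS = { a = 0.9, e = 0.6, i = 0.4, o = 0.8, u = 0.5 }

local function speak(sentence: string, secondsPerChar: number)
	-- Crude "emotion": exclamation marks exaggerate every mouth shape.
	local excited = sentence:find("!") ~= nil
	for char in sentence:lower():gmatch("%a") do
		local open = VOWELS[char] or 0.1
		if excited then
			open = math.min(open * 1.3, 1)
		end
		faceControls.JawDrop = open
		task.wait(secondsPerChar)
	end
	faceControls.JawDrop = 0 -- close the mouth when the line ends
end

speak("Hello there!", 0.06)
```

Syncing this to the actual audio would still need per-line timing data (or Roblox's audio analysis), but it shows how a sentence alone can drive mouth shapes without any microphone input.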
I mean, sure, but how much time do you have to perfect lip sync to that level when you’re trying to ship a game? There isn’t an entire team of skilled engineers behind most Roblox games like Valve has.
We already have the correct methodology to borrow from Half-Life. It took Valve a very long time and some very talented engineers to develop lag compensation; now every game (outside of Roblox) uses it, and even I could implement it myself given certain features (control over character networking, which Chickynoid offers but base Roblox humanoids don’t).