Half-Life 2 Lip Sync

How can I recreate Half-Life 2-style lip sync?
I don't want to make thousands of animations just for the mouth. I plan to add a lot of voice lines, which is why I'm asking.

The only approach I can think of is combining AudioListener | Documentation - Roblox Creator Hub and Facial Animation | Documentation - Roblox Creator Hub in some way.

Check how loud the audio currently is, then drive the facial animation however you want. You can find plenty of tutorials on AudioListeners, and the facial animation side is easy because the "FaceControls" instance exposes values you can edit directly.
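A minimal sketch of that idea, assuming the newer audio instances (AudioPlayer, AudioAnalyzer, Wire) and a rig whose Head contains a FaceControls instance. The asset ID is a placeholder and the scaling/smoothing constants are guesses you would tune:

```lua
-- Sketch: drive FaceControls.JawDrop from the loudness of a playing voice line.
local RunService = game:GetService("RunService")

local character = script.Parent
local faceControls = character.Head:FindFirstChildOfClass("FaceControls")

local player = Instance.new("AudioPlayer")
player.AssetId = "rbxassetid://0" -- placeholder: your voice line asset
player.Parent = character

local analyzer = Instance.new("AudioAnalyzer")
analyzer.Parent = character

-- Route the player's output into the analyzer
local wire = Instance.new("Wire")
wire.SourceInstance = player
wire.TargetInstance = analyzer
wire.Parent = character

player:Play()

RunService.Heartbeat:Connect(function(dt)
	-- RmsLevel is roughly 0..1; scale and clamp it into a jaw-open amount
	local target = math.clamp(analyzer.RmsLevel * 2, 0, 1)
	-- Lerp toward the target so the mouth doesn't snap open and shut
	faceControls.JawDrop += (target - faceControls.JawDrop) * math.min(dt * 12, 1)
end)
```

This only opens and closes the jaw; for anything closer to HL2 you would blend more of the FaceControls properties, as discussed below.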


Yeah, sure, but how could I make it not look weird? Opening and closing the mouth is easy; I mean realistic mouth motion.

I think the only solution would be animations…


Boooooo, why does everything have to be so complicated? An engine that isn't even being updated anymore has that feature; why can't Roblox have it?


I've been trying to find a script that does lip sync just like Half-Life 2 for a whole week 😭

If you watch the HL2 documentary, the amount of effort that went into the facial animation is incredibly detailed; it's probably not in the realm of possibility for Roblox. I also believe that game has premade animations for its audio tracks, because it wouldn't look nearly as good if it animated purely off the audio. You can, however, record audio and video with your webcam and import that as a facial animation, I believe.


True, in the Half-Life 2 commentary the guy who scripted the eyes mentioned how much of a pain it was, lol.
Thanks a lot, I'll try that!

Max actually managed to do Half-Life-style lip syncing using a Studio-only beta,
so it is seemingly possible.

2 Likes

Don't say it's not possible; we're not dealing with an internal pipeline issue here (which only Roblox could improve). We can script HL2-style lip sync ourselves, it'll just take quite a bit of time.

2 Likes

Quite honestly, I had completely forgotten about this post, but the audio analysis could be replaced by passing a string to whatever handles the lip sync and animating the mouth from that string/sentence, factoring in its emotional tone (not sure if this makes a lot of sense when reading).
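As a hypothetical sketch of that string-driven idea (the viseme table, the timing, and the exact FaceControls property names here are illustrative assumptions, not a known-good mapping):

```lua
-- Crude text-driven lip sync: step through a sentence and pick a
-- mouth pose per character from a tiny viseme-ish lookup table.
local VOWEL_POSES = {
	a = {JawDrop = 0.8, Pucker = 0.0},
	e = {JawDrop = 0.4, Pucker = 0.0},
	i = {JawDrop = 0.3, Pucker = 0.1},
	o = {JawDrop = 0.6, Pucker = 0.6},
	u = {JawDrop = 0.3, Pucker = 0.8},
}
local CONSONANT_POSE = {JawDrop = 0.15, Pucker = 0.0}
local REST_POSE = {JawDrop = 0.0, Pucker = 0.0}

local function applyPose(faceControls, pose)
	for property, value in pose do
		faceControls[property] = value
	end
end

-- secondsPerChar could later vary with the emotional state of the line
local function speak(faceControls, sentence, secondsPerChar)
	for i = 1, #sentence do
		local ch = string.lower(string.sub(sentence, i, i))
		local pose = VOWEL_POSES[ch]
			or (string.match(ch, "%a") and CONSONANT_POSE)
			or REST_POSE
		applyPose(faceControls, pose)
		task.wait(secondsPerChar or 0.06)
	end
	applyPose(faceControls, REST_POSE)
end
```

You would call `speak` at the same moment the voice line starts playing and tune `secondsPerChar` per line so the mouth roughly tracks the audio's pacing.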

I mean, sure, but how much time do you have to perfect lip syncing to that level when you're trying to ship a game? Most Roblox games don't have an entire team of skilled engineers behind them like Valve does.

We already have the right methodology to borrow from Half-Life. It took Valve a very long time and some very talented engineers to develop lag compensation; now every game (outside of Roblox) uses it, and even I could implement it given certain features (control over the networking of characters, something Chickynoid offers but base Roblox humanoids don't).

2 Likes

It could be done once as a module and published to the forum. I was thinking about doing that, or rather updating and fixing my existing module.

I can definitely see it happening if someone polished up a library for exactly this use case

2 Likes