Cool! Really exciting if Roblox can do that before DeepMotion.
Either way, I’d use the Roblox version, since it’s the official one.
They made their own AI model, I assume. They’re working on a similar feature for facial animation and have gone in-depth on how they developed that AI themselves, so I assume it’s the same here. Roblox is a powerful company with a whole team working on AI; why wouldn’t they be able to do it themselves?
This is going to help so many solo developers who don’t have the time to animate or can’t make good animations. Great addition!
They bought Loom.AI; clearly they couldn’t do it themselves.
Anyway, is there a blog post or anything where they go “in-depth” about it?
this is a top-tier way to use the new update
Beat Saber
I also wasn’t actually playing. I was just pretending to play so the AI could see my head and hand movements, and so the controllers and headset wouldn’t mess up the tracking (I do actually own one).
This is an absolutely astonishing feature, like a dream coming alive!
It doesn’t matter? If the animation turns out well, who cares how it was made. You pay for the product, not the process.
Oh my god! This is really cool! It’s a bit buggy, but still impressive.
Look what I created with it: Test dance
yes this is the peak of the update
Hey everyone,
Glad to see you trying out this new feature! We are loving each and every one of your videos that you’ve shared with us! That said, we noticed some of you have questions and are curious about a few things regarding this feature. Hopefully this FAQ provides some clarity.
Why is the animation flipped?
Where are my videos processed and stored?
Why is the video length limited to 15s?
Is there an easy way to further edit the animation with so many keyframes generated?
Are other rigs like R6 supported?
Why does my animation not look good?
Why is my video stuck?
If you have additional questions, please ask away!
That is a relief; I was genuinely worried about how these are processed.
It would be nice if these were processed with something smarter that predicts which easing style to give each keyframe and reduces the keyframe count.
One problem is that the animation editor is missing many easing styles, and the current output animations are super messy and nearly impossible to edit. It also doesn’t help that the keyframes all blur together when there are so many of them.
The video import is fine, and it’s nice that it’s getting improvements, but it still has many flaws. I just think it would be nice to import from video and then polish the result manually, like many other programs that do something similar.
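In the meantime, here’s a rough Luau sketch of the kind of keyframe reduction suggested above: it walks a KeyframeSequence and removes keyframes whose poses barely differ from the last keyframe it kept. The thresholds and the compare-against-last-kept strategy are my own assumptions, not anything Roblox has shared about how the importer actually works.

```lua
-- Sketch: prune near-duplicate keyframes from a KeyframeSequence.
-- The epsilon values below are arbitrary; tune them for your animation.

local POSITION_EPSILON = 0.01           -- studs
local ROTATION_EPSILON = math.rad(1)    -- radians

-- Collect every Pose in a keyframe into a flat table keyed by pose name.
local function collectPoses(keyframe)
	local poses = {}
	local function visit(pose)
		poses[pose.Name] = pose.CFrame
		for _, sub in ipairs(pose:GetSubPoses()) do
			visit(sub)
		end
	end
	for _, pose in ipairs(keyframe:GetPoses()) do
		visit(pose)
	end
	return poses
end

-- True if every joint moved less than the thresholds between two keyframes.
local function nearlyEqual(a, b)
	for name, cfA in pairs(a) do
		local cfB = b[name]
		if not cfB then
			return false
		end
		local delta = cfA:ToObjectSpace(cfB)
		local _, angle = delta:ToAxisAngle()
		if delta.Position.Magnitude > POSITION_EPSILON or math.abs(angle) > ROTATION_EPSILON then
			return false
		end
	end
	return true
end

local function pruneKeyframes(sequence)
	local keyframes = sequence:GetKeyframes()
	table.sort(keyframes, function(x, y)
		return x.Time < y.Time
	end)

	local lastKept = nil
	for i, keyframe in ipairs(keyframes) do
		local poses = collectPoses(keyframe)
		-- Always keep the first and last keyframes so the clip length is preserved.
		if i == 1 or i == #keyframes or not lastKept or not nearlyEqual(lastKept, poses) then
			lastKept = poses
		else
			keyframe:Destroy()
		end
	end
end
```

If you try it, run it from the command bar on a copy of the exported KeyframeSequence rather than the original, since the destroyed keyframes aren’t recoverable from the script itself.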
Can’t you do the conversion on the device instead of on a server?
I think it runs on their servers so that when they adjust the system, they don’t have to push another update to Studio; they just change how their servers handle the input.
Not only that, lower-end devices might also suffer from the amount of RAM this could take up, combined with other apps you may have open at the same time (such as Chrome), potentially causing Studio or the device to freeze or lag while processing.
But that’s really for Roblox staff to answer; I’m purely guessing.
After seeing the Face Recorder feature, I noticed it uses live tracking. Will body tracking also use live tracking in the future, or will it always require uploading a 15-second video? Great feature though! Keep it coming!
Hey everyone,
I’ve created a YouTube video featuring Live Animation Creator. It was a lot of fun and includes several examples. If you’d like to watch, here it is: LIVE ANIMATION CREATOR in Roblox Studio (New Feature) - YouTube
Thanks
This looks really good! I haven’t tried it yet but when I get the chance I will!