Thank you all, your feedback has been noted. It seems there are a few clear areas where you’d like improvements. Keep them coming!
I have a VR headset. It would be awesome to be able to use its motion tracking in combination with IK for motion capture, so I could record my own animations. That would save a ton of money compared to a full motion capture suit (especially with the low price of the Quest), and remove a lot of tedium when it comes to making animations as a non-animator. It would also be good to have support for multiple trackers, like 3rd-party full-body tracking (SlimeVR), or for attaching trackers to objects to animate them (e.g. Vive trackers).
also, stop working after work hours
How are you suggesting they accomplish this? For example, are you saying that the headset would act as the head of an R15 character, and the arms would move with IK in accordance with the hand tracking positions?
Let me make it clear: I’m not against this idea, I actually support it, but there are quite a few hurdles that need to be cleared before this could ever be a feature. For example, motion capture almost ALWAYS has to be cleaned up after the fact, because the raw data from even the most state-of-the-art MoCap suits is still relatively crude. The problem is, if Roblox implemented this feature, there is little to no chance you would be able to alter the animation yourself after capture (see the video-to-animation beta), and IMO that would render it borderline useless.
Not to mention that with just a headset and controllers, the motion recorded would be relatively basic and only apply to a couple of parts of a rig, making it look quite strange. Support for 3rd-party full-body tracking could help this enormously, but the fact of the matter is, I’m willing to bet that less than 1% of Roblox developers own Vive trackers or would set up a recording space to begin with. At the end of the day, it’s cheaper and easier for developers to either animate themselves or hire a talented animator, for whom these simple animations are an easy task.
But honestly, it kinda seems like Roblox to develop a random gimmick feature that hardly anyone would be able to use lmao, that’d be hilarious.
(For reference, I do own this hardware; I just don’t think enough OTHER people have access to these devices to make this a worthwhile investment vs. the ease of just making an animation yourself.)
Love and agree with the suggestions people have already made! A couple of extras I would love to see are:
- Official R15 rigs for all body types, OR a way to more easily export rigs from Roblox to enable R15 animation in external programs like Blender/Maya. Currently I’m only aware of the mannequin rig, which we used to manually create R15 rigs for other body types, but it’s a big hassle.
- The ability to keyframe all controls in the animation editor. Perhaps if you have multiple controls selected and press “…” → “Add keyframe”, it could keyframe everything you have selected.
- Enable copy/paste of keyframes between different controls. If I have animated one arm and want the other to do the exact same movement, I would like to be able to copy the values from one to the other.
- Add functionality for weighted tangents.
Thank you for the opportunity to give suggestions (and please let me know if any of these are already possible and I’ve just managed to miss them)
Please fix animation events. Currently you have to call :Disconnect every time an animation event is fired, and it’s frankly annoying. It would also be nice to have all the EasingStyles in the animator.
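For anyone not familiar with what I mean, the pattern looks roughly like this (a minimal sketch; the marker name, rig path, and asset id are just placeholders):

```lua
-- Minimal sketch of the manual-cleanup pattern described above.
-- The marker name ("Footstep"), rig path, and asset id are illustrative only.
local character = workspace.Rig
local humanoid = character:FindFirstChildOfClass("Humanoid")
local animator = humanoid:FindFirstChildOfClass("Animator")

local animation = Instance.new("Animation")
animation.AnimationId = "rbxassetid://0000000000" -- placeholder id

local track = animator:LoadAnimation(animation)

-- Connect to the animation event marker...
local connection
connection = track:GetMarkerReachedSignal("Footstep"):Connect(function(param)
	print("Footstep event fired with:", param)
end)

-- ...and remember to clean the connection up ourselves once the track stops,
-- otherwise it lingers.
track.Stopped:Connect(function()
	if connection.Connected then
		connection:Disconnect()
	end
end)

track:Play()
```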
There are 2 huge longstanding issues with animations right now.
- Sharing animations across projects is extremely tedious because they are locked per person/group. There’s hardly a need for animations to have the privacy settings they do; they end up wasting far more time than they protect.
There’s an entire project dedicated to moving your animations between the different groups you own, because of how much of a problem this is:
GitHub - evaera/roblox-animation-transfer: Transfers Roblox animations from one owner to another.
There have been feature requests about this issue for almost a decade now, so it would be really amazing if we could remove the place restrictions for Animations and make them usable in any experience.
Here are some long-standing feature requests asking for this:
Allow us to make our animations usable in any game by configuring it and add a library section
Public Animations
Unlock Animations!
We NEED the ability to control who has access to private assets
- There’s a huge amount of time wasted having to publish animations every time you make a change just to test whether they work. Luckily there’s an API to load keyframes into animations, but it’s only allowed in Studio, not in live games, so you still have to publish all your changes into Animation instances at the end of the day. You’re forced to constantly juggle keyframes and animations, and since the animation overwrite UI in Studio is really not good at showing all your animations conveniently, it’s far easier to just publish new animations than to bother overwriting the originals. (See the sketch after this list.)
That’s all to say, it would be really nice if we could turn keyframes into Animations in live games as well. I know that animation keyframes can take a lot of memory, and compressing them from hundreds of instances down to raw memory is way more efficient. But something like this could also just as easily be baked in Studio as its own data model.
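For reference, I believe the Studio-only API in question is KeyframeSequenceProvider:RegisterKeyframeSequence. A rough sketch of that workflow, assuming you already have a KeyframeSequence in the place (paths and names here are illustrative):

```lua
-- Minimal sketch of the Studio-only workflow, assuming the API in question is
-- KeyframeSequenceProvider:RegisterKeyframeSequence. Paths and names are illustrative.
local KeyframeSequenceProvider = game:GetService("KeyframeSequenceProvider")

local keyframeSequence = workspace.MyKeyframeSequence -- a KeyframeSequence you already have
local rig = workspace.Rig
local animator = rig:FindFirstChildOfClass("Humanoid"):FindFirstChildOfClass("Animator")

-- Register the sequence to get a temporary asset id (this only works in Studio).
local tempId = KeyframeSequenceProvider:RegisterKeyframeSequence(keyframeSequence)

local animation = Instance.new("Animation")
animation.AnimationId = tempId

-- Test the animation without publishing anything.
local track = animator:LoadAnimation(animation)
track:Play()
```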
Anyways, this was a huge wall of text, thank you for hearing us out,
Regards,
Aaron Mccoy
Hi, could you please explain what you mean by this a bit more? Thanks
You mean that if you publish a modified animation with the same ID, existing players in a server won’t see the updated version until they or the server restart?
Yeah, I’m fairly sure that’s the case, but it’s been quite a while since I’ve tested it.
I tend to avoid updating animations in favor of making new ones too, since I have to shut down my game to work the animation change into the game logic anyway. In most of my animation use cases it’s fairly rare that I need to publish a non-critical change to an animation that’s already live in-game. It’s still not ideal to have visual discrepancies for players in those rare circumstances, though.
Ah, my bad. Basically I was talking about rFrameAnimator’s graph system, which helps us customize our tweens and take them to the next level. Here’s a pic:
(Sorry for the lack of clarification before)
Let’s see…
The program I use for animation is Cinema 4D. What I like is that you can animate any 3D object, and rigs are only needed when something bends. Setting up rigs on Roblox is… tedious; many a rig I have had to abandon. Just the ability to move any object without a rig or Motor6D would be super helpful.
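(For what it’s worth, the closest thing today seems to be skipping the Animation Editor entirely and tweening a part’s CFrame with TweenService; a rough sketch, with a made-up part and timing values:)

```lua
-- Rough sketch of the current workaround: animating a single part without a rig
-- or Motor6D by tweening its CFrame. Part name and timing values are illustrative.
local TweenService = game:GetService("TweenService")

local part = workspace.Door -- illustrative part
local goal = { CFrame = part.CFrame * CFrame.Angles(0, math.rad(90), 0) }
local info = TweenInfo.new(1.5, Enum.EasingStyle.Quad, Enum.EasingDirection.Out)

local tween = TweenService:Create(part, info, goal)
tween:Play()
```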
Also, Humanoids are laggy in bulk.
Also, I really dislike the auto-keyframes.
(Maybe skin weighting would be cool too, but that’s low priority.)
Also, a way to change the pivot thingy between object pivot and world pivot:
THANKS SO MUCH!
It would be nice to “preview” or see my IKConstraints active while using the Animation Editor. This would make animating custom characters with Inverse Kinematics a lot easier without the use of external programs like Blender.
Current Animation Editor:
With Constraint Preview:
Obviously not a perfect example, but a feature like this would save a LOT of headaches when using IKConstraints in animations.
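For clarity, this is roughly what the runtime setup looks like (assuming “IKConstraint” here means the IKControl instance; the rig and target names are made up), and it’s exactly this effect that the editor doesn’t show while posing:

```lua
-- Minimal sketch of runtime IK setup, assuming "IKConstraint" above refers to
-- Roblox's IKControl instance. Rig, limb, and target names are illustrative.
local rig = workspace.CustomCharacter
local humanoid = rig:FindFirstChildOfClass("Humanoid")

local ikControl = Instance.new("IKControl")
ikControl.Type = Enum.IKControlType.Position
ikControl.ChainRoot = rig.LeftUpperLeg   -- start of the limb chain
ikControl.EndEffector = rig.LeftFoot     -- the part that should reach the target
ikControl.Target = workspace.FootTarget  -- where the foot should land
ikControl.Parent = humanoid

-- In a live game this pulls the leg toward the target, but the Animation Editor
-- currently doesn't preview this effect while you pose keyframes.
```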
OH WAIT, I FORGOT THE MOST IMPORTANT THING!
I don’t care how basic or rudimentary it is, please please please add the ability to line up a sound with an animation. Animating to dialogue and music is SO hard!
Mostly because this:
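In the meantime, the closest workaround I know of is firing the sound from a script off an animation event marker, which is a lot of extra plumbing just to hear audio against the animation. A rough sketch (the marker name, sound, and asset id are placeholders):

```lua
-- Rough sketch of the scripting workaround for lining up a sound with an
-- animation: play the sound from a named marker in the track.
-- The marker name, sound object, rig path, and asset id are illustrative.
local character = workspace.Rig
local animator = character:FindFirstChildOfClass("Humanoid"):FindFirstChildOfClass("Animator")
local dialogSound = workspace.DialogSound -- a Sound instance placed in advance

local animation = Instance.new("Animation")
animation.AnimationId = "rbxassetid://0000000000" -- placeholder id

local track = animator:LoadAnimation(animation)

-- "StartDialog" would be an animation event marker placed in the editor.
track:GetMarkerReachedSignal("StartDialog"):Connect(function()
	dialogSound:Play()
end)

track:Play()
```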
Fun fact: the Quest 2 has consistently outsold the Xbox Series X|S, and is not far behind the PS5. VR isn’t cheap, but as of the time of that article, there were 30% more total Quest 2s in circulation than Xbox Series X|S consoles. VR is a luxury, but it’s actually more accessible than you might think.
I don’t expect everyone to go buy Vive trackers, but even without them VR mocap is still pretty useful – I’ve used third-party tooling to record and import it into Roblox before (it’s just a huge pain because of all the tooling involved). You can definitely clean up mocapped animations by decimating unneeded frames through plugins (which also has the benefit of making them look more game-y instead of mocapped), but one of the things I’ve come to really appreciate is efficient iteration early in development. The animations don’t need to be perfect for that – I just need something that lets me rapidly create a “good enough” animation I can use to get a feel for the game I’m working on, which can later be replaced by a professionally made animation (potentially after I’ve used my fleshed-out placeholders to garner funding). Being able to shave hours off early prototyping by just natively acting out a bunch of animations in VR with my hands, rather than fighting flatscreen tooling, is something I’d pay good money for on its own.
The source you have provided lists sales estimates from 2021. While estimates are good for getting a general idea, it’s always better to source sales data from the product manufacturer rather than third-party analysts, and it’s better to find more recent data.
According to a Microsoft presentation, the Xbox Series X/S had sold around 21 million units by June 2023. (Source)
According to PlayStation, the PS5 had sold around 50 million units by December 2023. (Source)
A leaked slide from Meta’s Reality Labs roadmap presentation indicates that Meta had sold around 20 million Quest units as of February 2023. (Source)
This statement does not seem accurate.
This isn’t to say that you are incorrect. VR is more accessible than ever before. Meta has introduced tens of millions of people to VR.
I believe you’ve misinterpreted my statement here; I wasn’t saying that a large number of people don’t own VR headsets.
However, just because 20 million people have bought Quest headsets in recent years does not guarantee that those people play Roblox.
In my opinion, a feature like this, and the technical work required to push it out of development, would necessitate a LARGE number of people both supporting it and finding use in it. I was merely saying that this feature does not impact enough people for it to matter to Roblox.
In August 2023, the CEO of Roblox reported a download count of ~1,000,000 for Roblox on the Quest. (Source)
However, this statistic does not account for downloads across multiple accounts or device resets, which would put the real player count lower than this number. It is likely, though, that over the 273 days since the metric was reported, downloads have risen quite a bit. Factoring this in, you can get a decent idea of how many Roblox VR players there are.
However, not all Roblox players are developers.
As of September 2023, Roblox has reported that there are around 3 million developers on the platform. (Source)
Even if you make the bold assumption that Roblox VR players are just as likely to be developers as anyone else (and that they also own a desktop computer), only around 5% of Roblox developers would have access to this.
It’s just not enough of a community for Roblox to care in the first place.
With just three points of tracking, only so much motion can be gathered. I will revisit this in my next point.
I don’t get this. If you’re doing it for prototyping, I feel it would be both easier and faster to just put down a couple of basic keyframes to get a feel for it, as opposed to connecting your VR headset to a computer, putting it on, hitting record, and acting out a motion.
If you’re going to decimate the animation to make it look more game-y, why not just animate it yourself in the first place? What you’re doing is essentially taking the expressive detail (which is an extremely large draw of motion capture) and simplifying it down to what could easily have been animated by hand anyway.
A couple of sentences ago I mentioned that an extremely large reason for mocap is that it allows for realistic, fluid movement with a monumental amount of data, to SAVE TIME over keyframing such animations.
The way it seems you intend to use it inevitably betrays the reason for motion capture in the first place: you’re taking the animation and removing some of the biggest reasons FOR mocap.
In the end, I like the idea. However, I believe it is largely unnecessary.
The way you have described using it, it seems like it would only be useful for developers who own a VR headset, are not animators or don’t have someone who can animate for them, and want to push things out quickly. It just seems unnecessary to me, but maybe I’m wrong.
Since nobody has mentioned it yet…
Why not add the ability to import exported animation data from MikuMikuDance (MMD) as Roblox animation data? From what I’ve heard, MMD is an easy animation tool, so having the ability to import .VMD (Vocaloid Motion Data) files into Roblox sounds like a good idea, even if it might be niche.
Sure, the bones in the rig for the .VMD file would have to be named a certain way in order to map to an R15 rig, but it still seems like a good idea…
- A simple feature like a parent mesh to control the position of any character animation.
- An option that allows you to play any custom character animation during gameplay (see the sketch below).
The issue is explained right here:
How to play an Animation in Game Play? - Help and Feedback / Building Support - Developer Forum | Roblox
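For context, this is roughly what it currently takes to get a custom character animation playing during gameplay (a minimal sketch; the asset id is a placeholder):

```lua
-- Minimal sketch of playing a published custom animation on each player's
-- character from a server Script. The asset id below is a placeholder.
local Players = game:GetService("Players")

Players.PlayerAdded:Connect(function(player)
	player.CharacterAdded:Connect(function(character)
		local humanoid = character:WaitForChild("Humanoid")
		local animator = humanoid:WaitForChild("Animator")

		local animation = Instance.new("Animation")
		animation.AnimationId = "rbxassetid://0000000000" -- placeholder id

		local track = animator:LoadAnimation(animation)
		track.Looped = true
		track:Play()
	end)
end)
```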
Also, sharing IDs should be more intuitive. Anybody should be able to use them, and they should be easier to edit instead of having to make a new one. I just uploaded a bunch individually and it took way too long.
There are so many moments of me wanting to headbutt the wall just because of how limited the blending of animations is in general. We REALLY need additive animations.
Bulk exporting animations would also be a huge time saver.
The “Animation” object seems pretty useless; why can’t we just LoadAnimation with the ID directly?
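To illustrate: today you need a throwaway Animation instance just to carry the id (a sketch with a placeholder id), when passing the id straight to LoadAnimation would do the job:

```lua
-- The boilerplate being complained about: the Animation instance exists only
-- to carry the id. Rig path and asset id are placeholders.
local rig = workspace.Rig
local animator = rig:FindFirstChildOfClass("Humanoid"):FindFirstChildOfClass("Animator")

local animation = Instance.new("Animation")
animation.AnimationId = "rbxassetid://0000000000"
local track = animator:LoadAnimation(animation)
track:Play()

-- What's being asked for instead (hypothetical, not a real API today):
-- local track = animator:LoadAnimation("rbxassetid://0000000000")
```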
Also, please fix the cubic easing direction in the animation editor not matching the easing direction in-game (if you animate with “Out” it plays as “In” at runtime, and vice versa).
I strongly support additive animations. Every time I work with animations, the absence of this feature makes things significantly harder. Imagine being able to play an idle animation on top of every other animation. You would save so much time because most of your animations would only need one keyframe.
And add more animation priorities!!