Nightcycle Studios 12/3/2020 "Weekly" Update

SHL4 has begun testing!

Currently, the only way to become a tester is to join the Discord linked to all my games - but don’t worry if you can’t/don’t want to do that; in all honesty, you’re not missing much. I’m not saying this to be humble - the game at this time is just running around as a character in a 1-player server. It is very much a tech demo, a successful one at that, but there’s just nothing to do gameplay-wise currently. If you’re still curious though, there are a few great YouTube videos from community influencers about the testing, so feel free to check those out for some more gameplay!

But yeah, it’s really cool to finally have a larger audience interact with it - with only ~800 people having access, the game/tech demo has already been played 1,150 times! There are certainly some bugs to fix, but we’re blown away by how supportive people have been through it all. It takes me back to the positive responses when SHL2 went into the community testing stage in April 2017, except at around 10x the magnitude. It’s all quite exciting!

What’ve I been up to?
But let’s slow down for a moment and reflect in a bit more detail on what we’ve been doing for the past (checks logs) 3 weeks. I promised in the minor post a week ago that a full update was coming, and I’m here to deliver it. I’ll also include some fun character designs that were generated along the way.

What I hoped to accomplish in 1 week:

  • Cape physics
  • Character scaling
  • Avatar control improvements
  • Mobile porting
  • Fire person mode
  • Faces with dynamic expressions

What I accomplished in 3 weeks:

  • Cape physics
  • Avatar control improvements
  • Mobile porting
  • Faces with dynamic expressions
  • Released the game to testers

So, it appears I suck at estimating workloads. We’ll get more into that in the next post. For now though let’s break down each completed task.

Cape Physics
This was miserable, haha. What’s worse is that, looking back, I could probably make a better version in 1-2 days with everything I’ve learned - but frankly, I’m not sure that’s the best use of my time at this very moment.


So, here’s how the cape is rigged. The idea is that through adjusting the three major bone areas we could get dynamic control of the cape and accomplish most of what we wanted to do. The bones are slightly smaller at the top, as that area bends most severely: the cape projects out horizontally while gravity pulls it downward.

Half the struggle of setting this cape up was just porting it from Blender to Roblox. I’m still learning Blender, and up until now the most difficult thing I’d made was a hair strand - not exactly an expert-level task. Sure enough, I accidentally had some of the bones rotated in the wrong direction.

This wouldn’t have been a problem if I adjusted the cape using an animation, as the offset would be baked into whatever animation we created. Unfortunately, I was instead adjusting the bones each frame using some CFrame math, which meant a flipped bone would move in the opposite direction.

Roblox currently lacks bone visualization tools, so to fix this problem I actually made one. I’ll be releasing it soon, I promise - maybe even today. It’s free because, honestly, in my opinion it should be built into Studio - I can’t imagine adjusting bones without it. There seems to be demand for it: I posted a tweet of the gif above to my personal account and it got more likes than any of the official Nightcycle posts had that month - even more than the hair physics one!

But yeah, it took a few days longer than I expected just to get the bones oriented right and responding correctly to the code. The next major step was getting it to respond to character movement.

The way I handled hair physics allowed bones to twist - not a huge deal for the hair, since each strand was so small and round it wasn’t really noticeable. But the cape’s strands all operated on the same mesh, which meant that when one twisted it was highly visible. I had to rewrite how I was moving things.

With the hair I would set the CFrames with CFrame.fromMatrix, which required three Vector3s. This was fine, but I always had trouble dynamically determining the final vector, hence things would rotate. Once again, as with the IK work, the solution came in an axis-angle approach (CFrame.fromAxisAngle), which let me build the rotation from an axis and an angle rather than a full set of basis vectors. If I set the twist angle to 0, the cape wouldn’t rotate.
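As a rough illustration of the idea (a hypothetical sketch, not the game's actual code - `updateBone`, `swingAxis`, and `swingAngle` are invented names), the axis-angle constructor lets you drive a bone each frame around one well-defined axis, so no unintended roll is ever introduced:

```lua
-- Hypothetical sketch: orient a cape bone each frame without introducing twist.
-- CFrame.fromAxisAngle builds a rotation from an axis Vector3 and an angle;
-- since the rotation is only ever about this one axis, the mesh can't roll.
local RunService = game:GetService("RunService")

local function updateBone(bone: Bone, swingAxis: Vector3, swingAngle: number)
	-- Rotate the bone about a single axis instead of reconstructing a
	-- full rotation matrix from separate basis vectors.
	bone.Transform = CFrame.fromAxisAngle(swingAxis.Unit, swingAngle)
end

RunService.RenderStepped:Connect(function()
	-- e.g. swing each cape bone backwards by some computed angle:
	-- updateBone(capeBone, Vector3.xAxis, computedAngle)
end)
```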

After I got that solved, I added some flutter to the cape at higher speeds by running tick() through math.sin(). I delayed the wave per bone by summing the tick with the bone’s distance down the cape, and I increased the amplitude both farther down the cape and as the player got faster. The result is a flowing sine wave that ripples through the cape with relatively little math.
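The flutter described above might look something like this minimal sketch (the function name and the tuning constants are my own assumptions, not values from the post):

```lua
-- Hypothetical sketch of the cape flutter: a sine wave whose phase is delayed
-- by each bone's distance down the cape, and whose amplitude grows with both
-- that distance and the player's speed.
local function flutterAngle(distanceDownCape: number, playerSpeed: number): number
	local baseAmplitude = 0.05 -- radians; tuning value, assumed for illustration
	-- Amplitude grows farther down the cape and as the player speeds up
	-- (16 is Roblox's default WalkSpeed, used here as a reference speed).
	local amplitude = baseAmplitude * distanceDownCape * (playerSpeed / 16)
	-- Summing tick() with the distance delays the wave for lower bones,
	-- producing a ripple that travels down the cape.
	return math.sin(tick() + distanceDownCape) * amplitude
end
```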

As time goes on, I’ll likely redo this method to take into account things like player torque, as well as more accurate simulations of air physics, but for the early stages I think it turned out pretty well! We posted the gif above to Twitter and it became our most liked gif ever, flying past the bone demo from above and narrowly edging out an old tech demo for Mortal Metal. Little did we know it would not hold its newly gained title for long.

Avatar Control Improvements and Mobile Porting
This is a short one, and it was actually the last thing we did chronologically, but it’s not as interesting as what’s next, so we decided not to end with it. Basically, before now I had been using ContextActionService to create my own keybinds for player movement. I’d then feed those inputs into the part of the code that handles movement and make the relevant adjustments.

When porting to mobile, though, I realized “why reinvent the wheel?” and started looking through the Roblox module that handles controls in most games. Due to its modular nature, I could pretty easily hijack the control module and hook it directly up to the movement in my custom avatar. By doing this I was able to not only instantly port the avatar to mobile devices, but also improve the PC controls.
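A minimal client-side sketch of that hijacking approach, assuming the standard PlayerModule that Roblox ships into PlayerScripts (how the move vector gets fed into the custom avatar is left as a placeholder):

```lua
-- Hypothetical sketch: read movement intent from Roblox's built-in
-- PlayerModule instead of custom ContextActionService binds.
-- GetMoveVector() already accounts for keyboard, gamepad, and the
-- mobile thumbstick, which is what makes the mobile port "free".
local Players = game:GetService("Players")
local RunService = game:GetService("RunService")

local playerScripts = Players.LocalPlayer:WaitForChild("PlayerScripts")
local controls = require(playerScripts:WaitForChild("PlayerModule")):GetControls()

RunService.RenderStepped:Connect(function()
	local moveVector = controls:GetMoveVector() -- Vector3, magnitude 0..1
	-- feed moveVector into the custom avatar's movement code here
end)
```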

One problem I had been struggling with for a while was making the movement controls feel right. I had never done anything like this before, so all my custom attempts felt sluggish and sometimes flat-out wrong. By swapping in the Roblox control module, I was able to bypass the need to understand this entirely. Thanks, Roblox!

And more specifically, thank you @AllYourBlox, who appears to have written the module. It was quick to understand, easy to edit, and frankly I should have used it from day one. Thanks for helping my game out!

Making a Dynamic Face
So, anyone who has been on Roblox for a while remembers the intensely negative reaction to the Anthro tech demo released back around 2017.

It was so disliked by the community that memes about it are still made to this day. It is not to be confused with Rthro, the next-generation version of it that actually was released. To my knowledge, Anthro was always just a tech demo.

That being said, community support for Rthro is split at best, with most of the people enjoying it seeming to be newer members of the community. If I had to guess, they find the traditional Roblox look jarring and prefer something closer to what they’d see in other games.

One reason I believe the Rthro style was more popular than Anthro, though, is that they didn’t commit to a single look. From what I can tell, there is no set style; the main thing connecting the avatars is a deviation from traditional Roblox rigging standards. The problem is that I wanted a consistent art style in my game. That meant taking on the daunting task of creating a new look for an audience that seemed to vomit when presented with any deviation from the standard decal face.

Unlike the team that made Anthro, though, I had some (arguably unearned) advantages they lacked. For one, thanks to advancements in the Roblox engine, I could use mesh deformation, as well as a custom surface appearance to give the skin a matte aesthetic. This should help avoid the plastic-action-figure effect that made Anthro so uncanny. Secondly, I have both the hindsight of what they did and less strict deadlines to get it done. That doesn’t mean it would be easy.

Now, back when we were first releasing demos for LP3 (the name of our avatar style, standing for Low-Poly 3, as it was our third major attempt), we had to choose a basic face. The task of creating it, along with most other LP3 assets, fell to @Ryferion1, who did a great job and has graciously allowed me to show some older prototypes of the face.

image

Initially we went with this for the debut of the avatar style. It was relatively bland, which was fine since it was only a small part of the picture. It also had the benefit of being rendered in Blender, baking for a few minutes to get extra-soft lighting - something we would not have access to in a real-time Roblox game. Finally, the centered character’s face was mostly covered and had glowing eyes, which further offset the uncanny valley. The response mostly focused on the suits, but nobody seemed to like the faces all that much. Because the face was so abstract, people weren’t repulsed by the uncanny valley, but they still felt it was a bit off.

So, behind closed doors we started showing alternate eye concepts to a focus group of dedicated community members. They overwhelmingly voted for an eye concept that was much more realistic.

image

This was our first attempt to bring that face to life. Stylistically it was the previous face without the mask, with some adjustments to things like the nose and mouth. We showed it to the focus group and they were lukewarm on it, making some suggestions about various proportions of the face. The problem was that we couldn’t really get a consensus on what the issue was.

image

In an attempt to fix this, I photoshopped the face into six different variants for people to vote on. Faces F and B were clearly preferred, with D and E seeming too childlike and C’s eyes being too big. A was the original and came in almost last. With that feedback, we adjusted the model and imported it into Roblox Studio for the first time.

It was pretty creepy.

image

The soft shading of Blender was massively important to its appeal. The Roblox shading made it look waxy and lifeless, which made all the lifelike parts of him almost morbid. We had suspected this was possible, which was why we imported it before rigging anything. But now we’d confirmed our fears, and we were stuck - the face just didn’t look good and we didn’t know what to do.

So, Ryf and I had a few meetings where we just brainstormed. Some were text-only, some were via call, all were stressful. We ended up making the tough call to basically start over, throwing away quite a bit of work and testing feedback. We decided to re-embrace a much lower-detail, more polygonal approach to the faces.

Our first major inspiration was the avatar style shown above. We really liked how they handled the nose, as well as their use of a more defined brow to make eye shading more impactful. It was still pretty high-detail, though, and we weren’t sure it would fit our avatar bodies.

Another style we took inspiration from was the Polygon line of avatars. We felt it worked great with flat shading, and we liked its emphasis on larger, single-color eyes. Its low-detail style was much more compatible with the avatar bodies we already had.

By the end of it, we had created a new style: the blockier nose and pronounced brow from our inspirations, combined with a low-detail face and single-color eyes.

We then imported it into Studio and breathed a sigh of relief as it wasn’t tripping the uncanny-valley alarms in our heads. We had successfully abstracted far enough away from a real person. But our work was not done yet.

image

Using Photoshop, I messed around with the proportions, added the heads to bodies, and presented my audience with what I felt was a major improvement. Then I had the focus group vote on a comparison image to see how much they liked the new version over the old!

image

eugh.

I should have been an architect.

They hated it. But after further discussion with the focus group, it became apparent that the reason they liked the other head more was that they really hated the trapezoid eyes on the new one. I liked the new eyes, but I wouldn’t be playing the game as much as they would, so I put together some alternatives.

image

I photoshopped different shapes, as well as a cartoon variant, and then, as a last-minute addition, threw on a photoshop of the eyes from the first version. As you can see, people really liked that last-minute addition. To think, we were one afterthought away from missing out on their favorite variant.

I also photoshopped some real eyes onto it to mess with Ryf.


Needless to say we did not show the focus group testers this variant.

But we had the eye problem figured out thanks to that rapid prototyping, so it all came down to whether this new version of the face could compete with the version from before.

And the answer: kind of! It wasn’t unanimous, which would have been great, but we felt it was enough of a clear improvement (especially considering the vote with the earlier eyes) that we were going to go ahead and make the changes.

image

And, we had our face!

Now, I just had to stitch it onto the existing body. That sounds pretty gross, but in all honesty some of the bugs I had to fix will haunt me for the rest of my life.

Starting off on a lighter note: the first head came in too small, and it made me chuckle.
image

After rescaling it (we’d actually have to rescale it once more after focus group testing), it was time to begin work on real-time dynamic facial expressions. Our first attempt at expressing emotion went like this:
image
I don’t know what emotion this is, but I hope I never feel it.

Also around this time I realized we had never actually added ears to the model, so I had to go make those in Blender. It took like six hours because, as I mentioned, I’m bad at Blender. I also had to do a bunch of face rigging work, but luckily Ryf had already started on that, so I had something to build off of.

After some more debugging I managed to get the eyes to follow the camera. Cotton Candy Logan Paul here actually gave me some hope - I was worried it would be horrifying, but it wasn’t. While it still has a ways to go, I realized I wasn’t feeling some primal urge to flee when I gazed at it - something we in the game art community call “a good sign”!
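One hypothetical way camera-following eyes could be wired up (none of these names come from the post; this assumes each eye is a Bone in the deformed mesh):

```lua
-- Hypothetical sketch: aim an eye bone at the camera every frame.
local RunService = game:GetService("RunService")

local function trackCamera(eyeBone: Bone)
	RunService.RenderStepped:Connect(function()
		local camera = workspace.CurrentCamera
		-- CFrame.lookAt builds a CFrame positioned at the eye, facing
		-- the camera; writing it to WorldCFrame reorients the bone.
		eyeBone.WorldCFrame = CFrame.lookAt(
			eyeBone.WorldPosition,
			camera.CFrame.Position
		)
	end)
end
```

In practice you would likely clamp the rotation so the eyes cannot swivel fully backwards, but the core look-at idea is this simple.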

I joke to offset the upcoming horror:

So, blinking was its own challenge. I was still learning how to weight paint, which caused a lot more of the face to move with the eyelids than expected. It took a few more hours to figure this one out.
image
Coincidentally around the time I posted this, Ryf’s status changed. Read into that what you will.

But, after some more debugging I got it to blink like a human!

I felt this really brought the face to a whole new level - like I not only didn’t feel like running, but actually thought it was kind of cool (aka in game art terms “a great sign”).

The eyebrows and mouth required me to fix some rigging issues, but soon they were integrated and I had full emoting control over the face! I built up a library of expressions for it to cycle through and posted it to the focus group.
image
Unanimous support.

Finally.

So, I went and posted the gif to Twitter. There weren’t any majorly critical replies to it, but I was nervous, so I still made a semi-apologetic tweet saying it was a work in progress.

Guess I was used to these face demos backfiring. I took one last look at the gif, and went to bed, hoping for the best.

image

The demo exploded

The tweet I’d made to the Nightcycle Studios account showing off the dynamic facial expressions went Roblox Twitter viral. Massive Roblox community figures replied to and retweeted it. We had done something which nobody had managed to do before - get the Roblox Twitter Community to approve of a humanoid Roblox avatar. Not only did the most difficult thing we had ever tried succeed, but it had become our greatest success.

Since first envisioning LP3 in May 2020, Ryferion has spent almost 150 hours making and refining assets for it, all while balancing a full class load. As a full-time developer, I have spent almost 600 hours on the avatar’s physics, customization, real-time inverse kinematics, hair physics, cape physics, face and cape rigging, and of course the dynamic facial expressions. For comparison, I completed the first Super Hero Life in around 300 hours.

These likes would not have been possible without every hour of work, and they certainly would not have been possible without Ryf’s amazing talent. I’ve never been more proud of a creation, but that doesn’t mean it’s perfect.

There’s of course room for more polish - I can name at least a dozen things I hope to fix with the avatar before the game releases for free next year. On top of that, there were a few replies to the tweet from people unsure about it - if I had to guess, they made up 4-5% of the audience. From what I can tell, most of them disliked the fact that Roblox is changing rather than any specific implementation issue. And I know what it’s like when people dislike the implementation - that’s been my last month with this face.

But thankfully, it appears the intensive focus group testing helped us create something almost everyone can get behind. Shout out to Professor Gray of my former Purdue User Experience Design class: you were completely right, focus group testing pays off. I wish it hadn’t taken me so long to listen. In fairness, focus group testing is quite a miserable chore, but I’m beginning to realize how important it is. I’m making up for lost time now, and because of this new tool in my developer belt, Super Hero Life IV will be the best game I’ve ever made.

Releasing SHL4 0.1.0
So, after fixing some bugs pointed out by early testers, I added an intro GUI to give people relevant playtesting info and released the game to the ~800 verified testers ready to play it.

So far, people seem to really like it, though the game as a whole is not without its issues. There are concerns about hair and cape clipping, the camera feeling a bit floaty, some minor problems with the walk animation, etc. They’re all things I plan on fixing, but most importantly they’re things that are clearly fixable - there are no major catastrophes. From what I can tell, very few people have problems with the faces, very few have problems with the customization (something the community was highly concerned about leading up to release), and overall people really enjoy running and jumping around as the new avatar. On top of that, unlike Super Hero Life III, it appears to run just fine on most mobile devices.

I’d say 0.1.0’s release was a success. People seem excited for where the game’s going - and I’m one of them.

Next Steps
Version 0.2.0 is all about customization and adding multiplayer. I’ll be spending the next week or so just planning out what exactly my next steps will be, as well as doing a few polishing updates based on feedback. In my next post I’ll outline that plan, as well as share any progress I’ve made towards it.

Thank you again for all the support, and thanks for reading!

Edit: Also, Roblox made a documentary about me. That was an overwhelmingly positive yet surreal experience. Give it a watch here!
