Becoming a Butterfly

--

Part I: The Body

Butterfly player avatar mockup

This month’s NatureSims post is focused on the creation of the monarch butterfly player avatar, or the butterfly body players will occupy when they enter the game. This will be a fairly high-level overview of this process, so don’t expect lots of code snippets or details about various Unity settings. (That said, if you have specific questions about any of these things, feel free to ask!)

The traditional way to handle this sort of thing would be to do it in the third person, like having a camera floating over the body of the monarch butterfly you are controlling as though you were looking on from above. But that’s not what we’re going for here. In order to make full use of our VR capabilities, we want a first person avatar that allows players to see the virtual world from the perspective of our (virtual) butterfly, and interact with it in much the same way that a butterfly would.

In other words, when you put your headset on and start the game, your head (in the virtual world) should be where the butterfly’s head would be, and if you flap your right arm up and down like a wing, the right wing of the virtual butterfly should move accordingly. If you turn your body to the left in the real world, the butterfly’s body should also turn to the left in the virtual world.

Here’s a short clip of this month’s prototype — the grounded monarch:

There are three basic steps to this process: (1) creating the butterfly model using Blender for 3D modeling, (2) creating materials for the model using Substance Alchemist and Painter and (3) integrating VR functionality into the model using the Unity game engine. While I’ll discuss them in a linear fashion below, it’s worth noting that this wasn’t actually a linear process. In reality, there was a whole lot of iteration going on: creating a basic model in Blender and then testing it in Unity, making fixes to the model and testing it again, trying a first draft of the materials and testing again, repeat, repeat, repeat.

Step I: Creating the Geometry

Before we can do anything in VR, we need to create our monarch 3D model. To save time, rather than build the whole thing from scratch, I purchased a low-poly 3D model of a monarch butterfly that was rigged — meaning it had an armature, or bones we could use for animations. I then modified it using Blender to look and work the way I wanted.

For my purposes that meant:

  • removing the head (this is where the VR camera will go),
  • detaching the antennae and proboscis (the straw-like butterfly “tongue”) so we can control them independently in VR, and
  • modifying the rig/armature and wings of the model to work well with VR hand controllers.

Removing the head is relatively straightforward — we want this to be an empty space where we can put our VR camera in Unity, so that when a player puts the headset on, they are looking out from where the butterfly head would be. We detach the antennae and proboscis for similar reasons — we want these to move with the player’s actual head in Unity. So we make them into separate meshes in Blender, and we can then make them children of the VR camera in Unity (so when the camera moves, they move with it, maintaining their local position relative to the camera). I also added a custom pivot point in Blender where the head should go, which we’ll need to move/rotate the avatar model in Unity.
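
Since this post is light on code, here’s a minimal Python sketch of what “maintaining a local position relative to the camera” means for a parented object (the project itself is Unity C#; the function name and the yaw-only simplification are mine, not project code): the child’s world position is its local offset rotated by the parent’s rotation, then translated by the parent’s position.

```python
import math

def child_world_position(parent_pos, parent_yaw_deg, local_offset):
    """World position of a child object given its parent's position and yaw
    and the child's local offset (yaw-only simplification of Unity parenting)."""
    yaw = math.radians(parent_yaw_deg)
    lx, ly, lz = local_offset
    # Rotate the local offset around the vertical (Y) axis, then translate.
    wx = parent_pos[0] + lx * math.cos(yaw) + lz * math.sin(yaw)
    wz = parent_pos[2] - lx * math.sin(yaw) + lz * math.cos(yaw)
    return (wx, parent_pos[1] + ly, wz)

# An antenna sitting 0.2 units in front of and 0.1 above the camera:
print(child_world_position((0.0, 1.6, 0.0), 0.0, (0.0, 0.1, 0.2)))
# When the camera turns 90 degrees, the antenna swings around with it.
print(child_world_position((0.0, 1.6, 0.0), 90.0, (0.0, 0.1, 0.2)))
```

Unity does all of this for us once the antennae and proboscis are children of the camera; the sketch just shows why parenting gives us the head-following behavior for free.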

The next part was a bit more complicated. I needed to modify the mesh so that the dimensions of the wings were more accurate and the wings themselves had thickness (they started as flat planes), particularly around the tubular veins that run through the wings to provide support. I also needed to rebuild the rig/armature of the model so that the wing bones could be animated using inverse kinematics (see Step III below) in Unity with the hand controllers as targets. I used scale images of adult monarchs to recreate the wings and trace the paths of the veins.

Monarch avatar wing creation, shown in Blender

I then made the veins into 3D “tubes” that run throughout the wings, as they do for real butterflies to provide the wings with nutrients and stability. (Note: in the final model I did this only for the forewings to reduce the number of tris.)

I also spent quite a bit of time playing with the wing mesh to get regular edges running through, and lining up between, the fore- and hindwings. This is very important for getting the wings to animate properly, as they’ll only be able to “bend” along these edges. If the edges/vertices are arranged more haphazardly, the wings won’t appear to move correctly when they’re flapping.

Monarch avatar geometry, shown in Blender

Once the geometry was ready, I did a few quick mock-ups showing how I anticipated the VR avatar fitting together in Unity. As you can see in the images below, the butterfly avatar needs to be quite large to fit the height of the player (using floor-level tracking in Unity). Building it in this way means that the player’s head will be where the butterfly’s head would be, and the player will perceive their actual feet to be at the same level as the butterfly’s feet in the virtual world, improving the level of immersion.

The player’s hand controllers will also be able to control the positions of the butterfly’s wings, so when the player stretches their arms straight out, the butterfly’s wings will likewise stretch straight out. Raising the right arm will cause the right wing to rise accordingly. The wings will simply follow the locations of the hands, with limitations added to each joint of the wing bones so that they cannot move or rotate in an unrealistic way.
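
As a rough illustration of those joint limits, here’s a Python sketch (the real implementation lives in Unity; the chain length and limit values here are made up for illustration): a desired wing elevation is distributed along a small bone chain, with each joint clamped to its allowed range so the wing can’t bend unrealistically.

```python
# Hypothetical per-joint limits (degrees) for a two-bone wing chain.
WING_JOINT_LIMITS = [(-30.0, 80.0), (-15.0, 45.0)]

def clamp(value, lo, hi):
    return max(lo, min(hi, value))

def wing_joint_angles(desired_elevation_deg):
    """Distribute a desired wing elevation across the chain, clamping each
    joint so the wing cannot move or rotate in an unrealistic way."""
    angles = []
    remaining = desired_elevation_deg
    for lo, hi in WING_JOINT_LIMITS:
        a = clamp(remaining, lo, hi)
        angles.append(a)
        remaining -= a
    return angles

print(wing_joint_angles(60.0))   # fits in the first joint: [60.0, 0.0]
print(wing_joint_angles(110.0))  # overflow spills into the second joint: [80.0, 30.0]
print(wing_joint_angles(150.0))  # both joints saturate: [80.0, 45.0]
```

The effect is that raising a hand far above a realistic wing position just pins the wing at its limit rather than folding it backwards.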

Finally, I used Blender to unwrap the models, creating a UV map that I could use in Substance Painter to create the avatar materials: one for the body, and another one for the antennae and proboscis.

Step II: Creating the Materials

One of the challenges with NatureSims projects has been that the scale I’ve been working at makes it harder to find good third-party assets to use. There are hundreds of options out there for plants and other miscellaneous objects online, in the Unity Asset Store or elsewhere. But a virtual flower that looks nice when you’re the size of an average human generally looks pretty terrible when you shrink down to the size of a butterfly. What might have started as a “AAA” asset ends up looking more like a pixelated model from the 1990s.

The same is often true for the materials and textures you can find. They just don’t look right because we’re working at a substantially different scale. So, before jumping into Substance Painter to paint my models, I took a quick detour to Substance Alchemist to generate some of my own materials.

For this project, players can’t actually see too much of the monarch’s body, since it’s more or less directly behind the player camera. But the player does get a great view of the monarch’s wings so the wing material is particularly important.

Monarch wings, like those of other butterflies, are covered in tiny scales. While these scales aren’t discernible to the naked eye, you can see them clearly under magnification.

Magnified Monarch Butterfly Wing Scales. Image Credit: Raul Gonzalez

This is important for me because, if you imagine a butterfly scaled up to the size of a person as in the mock-ups above, the wing scales would be clearly visible. In images I’ve found they look almost like a soft fabric, as in the image below.

Magnified Monarch Butterfly Wing. Image Credit: Annemaria Duran

Alchemist has a pretty amazing feature that lets you generate a material based on a single image. It also has tools to make that material tile well, or repeat in the x and y axes to form a continuous whole. So to create my “wing scale” materials, I used an image of a magnified monarch wing, and made a few fixes to make it tile well. Then I just had to modify the base color to generate wing scale materials in a few different colors, to match the colors on a real monarch wing.

Here’s a screenshot of one of my dark brown wing scale materials, shown in Alchemist.

Dark Brown Wing Scale Material in Substance Alchemist. Right is the base material, left is that material tiled over a rounded cube.

Once I had those materials ready, I brought them along with my Blender model into Substance Painter. Painter allows you to take a wide variety of individual materials, like the brown wing scales shown above, and paint them directly onto a mesh. You can then export the combined material for the entire mesh (in the form of a texture atlas) that you can use in Unity or even bring back into Blender.

Monarch Avatar Wing Material Close-up in Substance Painter

For the monarch avatar, I used my wing scale materials for my butterfly wings and a few other materials I found online for the butterfly’s hairy body and dark, chitinous lower legs. I spent the most time on the wings, since they’re the most important, and painted the antennae and proboscis separately. You can see a close-up of one of the wings along with a top view of the entire body in the included images. This won’t be the final version, but works well enough for this prototype.

Monarch Avatar Material in Substance Painter

Step III: VR Integration

Once the materials and the model were ready (or at least ready enough to test), I brought everything into Unity to begin putting it all together and adding the VR functionality needed to control the avatar with a combination of our headset and hand controllers. To get started, I used the Oculus OVRCameraRig prefab with the intention of adding Steam/OpenVR functionality later on (which is definitely a hassle: really looking forward to using the OpenXR standard for both in the future).

The first step here involved some initial calibration of the avatar — basically setting it up so that it was oriented and scaled correctly to “fit” players of different heights. That is, when a person starts playing the game, the player’s head (i.e. the active camera) should be where we’d expect the monarch avatar’s head to be, and, when landed, the avatar’s virtual feet should hit the same ground plane as the player’s actual feet, which helps increase the feeling of immersion.

Avatar mockup showing proper “fit”

In the full app/demo, player height will be measured in an opening “calibration” scene, and the avatar will be scaled appropriately via scripting when the game begins, similar to what you see in many other VR apps. But, to make prototyping easy (and avoid loading scenes/transitions for the time being), I’m just adding the player height manually for myself and other early testers, along with their arm length, which we need in order to determine how extended the avatar’s wings should be for the flight mechanics to work properly.
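
The scaling itself is simple to sketch. Here’s a small Python illustration (not the project’s Unity code; the design height and wing-target margin are hypothetical numbers I picked for the example): the avatar gets a uniform scale factor from the measured player height, and the arm length feeds into how far out the wing targets sit.

```python
# Hypothetical design height (metres) of the avatar model at scale 1.0.
AVATAR_DESIGN_HEIGHT = 1.75

def avatar_scale(player_height_m):
    """Uniform scale factor so the avatar's head lines up with the player's
    head and its feet with the real floor (floor-level tracking)."""
    return player_height_m / AVATAR_DESIGN_HEIGHT

def wing_target_distance(arm_length_m, margin_m=0.1):
    """How far from the shoulder the wing target sits when the player's arm
    is fully extended (the margin value is made up for illustration)."""
    return arm_length_m + margin_m

print(avatar_scale(1.75))         # 1.0, a player exactly at the design height
print(avatar_scale(1.40))         # a smaller scale for a shorter player
print(wing_target_distance(0.7))  # the target sits just beyond the hand
```

In Unity this would amount to setting a single uniform `localScale` on the avatar root once per session.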

Integrating some basic controller movements was the next step, allowing the player to move the avatar on the ground plane by using the left thumbpad/joystick, and rotate the player by using the opposite thumbpad. I’ve enabled OVR’s positional tracking, so players can also move the avatar by physically moving/rotating around their play area, but enabling the thumbpads makes basic movement quite a bit easier for players without large play areas.
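
As a sketch of that thumbstick locomotion (in Python rather than Unity C#, with parameter names of my own choosing), the stick vector gets rotated by the avatar’s current yaw so that pushing forward always moves the avatar in the direction it’s facing:

```python
import math

def move_on_ground(pos, yaw_deg, stick_x, stick_y, speed, dt):
    """Translate the avatar on the ground plane from thumbstick input,
    relative to the direction the avatar is facing (+Z forward, Unity-style)."""
    yaw = math.radians(yaw_deg)
    # Rotate the stick vector by the current yaw so "up" on the stick
    # always moves the avatar forward.
    dx = (stick_x * math.cos(yaw) + stick_y * math.sin(yaw)) * speed * dt
    dz = (-stick_x * math.sin(yaw) + stick_y * math.cos(yaw)) * speed * dt
    return (pos[0] + dx, pos[1], pos[2] + dz)

# Facing +Z, pushing the stick forward moves the avatar along +Z.
print(move_on_ground((0, 0, 0), 0.0, 0.0, 1.0, 2.0, 0.5))
# Facing +X (yaw 90 degrees), the same input moves it along +X instead.
print(move_on_ground((0, 0, 0), 90.0, 0.0, 1.0, 2.0, 0.5))
```

The Y component is left untouched, which is what keeps this movement pinned to the ground plane for the grounded prototype.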

One trick here is that the avatar body and the camera need to be moved and rotated independently. We obviously want the avatar body to follow the head/camera, but we want to make sure we can control precisely how it does this. So, for example, if the player looks down in the game (rotating the camera around the red X axis in Unity), the avatar body shouldn’t simply apply the same rotation. If you look at the mockup above, that would result in the avatar body rotating into the air behind the camera rig in an unrealistic way.

The Unity transform, storing an object’s position, rotation and scale.

Instead, I set this up so that the avatar body follows the camera, using the avatar’s pivot point that I added in Blender. Rotation is trickier. Basically, we want the avatar to orient toward the same forward vector as the player, but just as you can turn your head without turning your body, we want the player to be able to look around in the virtual space without necessarily turning the avatar body. As noted above, we also want to prevent the avatar body from rotating in an unrealistic way. Finally, we want the player to be able to select whether they’d prefer to use the orientation/forward vector of the camera or one of the hand controllers to control the rotation of the body.

For this prototype, I settled on using only the rotation around the green Y axis from the selected source (controller or headset), while retaining the avatar’s rotation around the X and Z axes. Basically, this means that the avatar can rotate around in a circle, keeping the body parallel to the ground plane, but it can’t rotate forward/backward or to the sides. This works well enough for the purposes of this early prototype, but it’s worth noting that it has a significant limitation: it only works on a ground layer that is relatively flat. If we put our avatar on a steep slope it will break, as the avatar won’t be able to rotate its body to keep it parallel to the sloped surface. An issue for another day.
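
The yaw-only rule is almost a one-liner. A minimal Python sketch of the idea (function name mine; angles expressed as Euler degrees for clarity):

```python
def follow_rotation(avatar_euler, source_yaw_deg):
    """Take only the Y (yaw) rotation from the selected source (headset or
    controller) while keeping the avatar's own X and Z rotation, so looking
    down doesn't pitch the whole body into the air behind the camera."""
    x, _, z = avatar_euler
    return (x, source_yaw_deg, z)

# Avatar level on the ground; the player looks down-and-left, so only
# the yaw follows while pitch and roll stay zero.
print(follow_rotation((0.0, 0.0, 0.0), -45.0))  # (0.0, -45.0, 0.0)
```

This is also where the flat-ground limitation shows up: the avatar’s X and Z rotation stay fixed, so there’s nothing here to align the body with a sloped surface.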

Once the avatar could move around, it was time to start on animations. For this project, this includes both “FK” and “IK” animations, or forward kinematics and inverse kinematics. FK animations are the standard, and work more or less the same in Unity as they do in claymation. If you want a humanoid clay avatar to raise its right hand in a stop motion video, you take a series of images beginning with the base/resting pose. For each subsequent image, you move the right arm and hand just slightly until it’s reached the desired position. Then play the images in sequence to see a video of the clay avatar raising its right hand.

This works much the same with digital avatars in Unity: you begin with a starting pose and record the movement and/or rotation of the avatar’s right arm bones over a set number of frames until the right hand has reached the desired end position. The total number of frames and the playback speed (frames/sec) then determine the length of the animation clip, which can be played during gameplay and even blended with other animation clips. So, for example, a character can walk and raise its right arm at the same time.
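
A toy Python version of that sampling logic (the clip data is hypothetical, and real Unity clips store animation curves per property rather than a flat list) shows how frame count and playback speed interact:

```python
def sample_clip(keyframes, fps, t):
    """Sample a bone angle from a simple FK clip: keyframes are per-frame
    angles, t is seconds into the clip, with linear interpolation between
    frames and the final pose held past the end."""
    frame = t * fps
    i = int(frame)
    if i >= len(keyframes) - 1:
        return keyframes[-1]            # hold the final pose
    alpha = frame - i
    return keyframes[i] * (1 - alpha) + keyframes[i + 1] * alpha

# A 5-frame clip at 4 fps raising a bone from 0 to 40 degrees.
clip = [0.0, 10.0, 20.0, 30.0, 40.0]
print(sample_clip(clip, 4, 0.0))    # 0.0, the starting pose
print(sample_clip(clip, 4, 0.375))  # 15.0, halfway between frames 1 and 2
print(sample_clip(clip, 4, 2.0))    # 40.0, past the end of the clip
```

Blending two clips, like walking while raising an arm, is then just a weighted average of the angles each clip produces for the same bone.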

Some nice things about these animations are that they’re versatile, relatively easy to set up, and computationally cheap (i.e. very low impact on performance). For this prototype, I’ve set up FK animations for the proboscis, wings (when the avatar is grounded) and default leg positions — all of which you can see in the clip at the beginning. I’ll also add FK animations to the antennae once I start working on approximating the perceptual world of the butterfly (with the antennae allowing the player to detect airborne chemical signatures like pheromones or the smell of nectar).

But they don’t work for everything. For this prototype, we want the player to be able to control the wings with the controllers: the virtual right wing, when active, should follow the location of the player’s right hand controller, so that if the player makes a flapping motion (moving the right arm up and down parallel to the body), the right wing flaps in the game along the same trajectory. This is a job for IK animations, which let us take a starting position and a desired target, or end position, and automatically calculate the best way to reorient the avatar’s bones to arrive at that end position.
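
To give a feel for what an IK solver does, a two-bone chain (a simplified wing) can be solved analytically with the law of cosines. This is a 2D Python sketch of the core geometry only; Final IK works in full 3D with rotation limits and many more features, so this is not the toolkit’s actual algorithm.

```python
import math

def two_bone_ik(l1, l2, target_x, target_y):
    """Analytic two-bone IK in 2D: given bone lengths and a target point
    (relative to the root joint), return the root and mid-joint angles
    (radians) that place the chain's tip at the target."""
    d = math.hypot(target_x, target_y)
    d = max(min(d, l1 + l2 - 1e-9), 1e-9)   # clamp unreachable/degenerate targets
    # Law of cosines for the interior angle at the middle joint.
    cos_mid = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    mid = math.acos(max(-1.0, min(1.0, cos_mid)))
    # Root angle = direction to target plus the offset within the triangle.
    cos_off = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    root = math.atan2(target_y, target_x) + math.acos(max(-1.0, min(1.0, cos_off)))
    return root, mid

# Verify by running the chain forward: the tip should land on the target.
root, mid = two_bone_ik(1.0, 1.0, 1.2, 0.8)
a2 = root - (math.pi - mid)   # second bone folds back by the exterior angle
tip = (math.cos(root) + math.cos(a2), math.sin(root) + math.sin(a2))
print(tip)  # approximately (1.2, 0.8)
```

The “best way” part of a full solver is choosing among the possible solutions (elbow-up vs. elbow-down here) and respecting joint limits, which is exactly what a toolkit like Final IK handles for us.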

For this prototype, I’m using the Final IK toolkit for this, which I highly recommend as it both performs well and makes the process about as easy as it could be. I have IK animations set up to control the wings so that, when active, the target for the right wing is a child of the right hand controller, positioned just beyond where the end of the wing would be when the player’s arm is extended straight out to the side (otherwise the wing would curl in on itself). I’m also using a physics-simulation package, the Boing Kit, to make the avatar wings move more like real butterfly wings — more like a flexible textile than a rigid mesh, as you can see in the gif below.

Avatar wing using Final IK and Boing Kit

Finally, I’m also using IK animations to animate the avatar’s four main legs (yes, butterflies are insects and have six legs, but for monarchs the two in front are smaller and generally curled up close to the body) to get some more natural-looking walking/turning behavior.

That’s all for now — this month’s post turned out quite a bit longer than expected. Thanks for making it all the way through!

Up next: adding basic flight and gliding functionality to our avatar
