Experiments come together
To bring together all the content created over the last month, I made a short story teaser. It is a “teaser” because the motions I have from the labs are very diverse and not ideal for a continuous piece. At the same time, as part of my explorations I used the 3D scan of myself, which adds its own suite of problems (extending meshes without textures, weird positioning and rotation of joints, etc.). Another issue is that not all the motions were performed by me, so the mapping causes trouble by itself. Finally, because of the timing, the lack of focus and coordination between the labs and this work (made worse by my own lack of definition of, and satisfaction with, what to do in Unreal until literally last week) and the less-than-ideal animations, my camera movements were restricted and I didn’t have time for many reshoots. Hence, the movements and camera cuts are definitely not good, and there are some issues I could barely hide, but I’m now more aware of how to do this better.
For the future, I know some things I have to deal with to prepare better, and other things I still need to solve:
- Plan the camera work and the motions needed for the project. This can be solved with storyboarding, storyboarding, and more (and more precise) storyboarding. Also, record extra animation captures, just in case they are needed.
- Learn how to make animations “stay in place”. I couldn’t find the option this time, so I had to work around it, which meant I couldn’t blend motion capture animations.
- Learn how to render from the sequence cameras. I tried using the renderer, but the resulting videos looked awful: low quality and terribly laggy.
- Get the Niagara Particles plugin to work. I spent a long time following this tutorial and trying to get the plugin to work, to no avail. I thought one problem might be that I was using Unreal 4.19, so after a while I created a copy in 4.20 (the version used in the video), but that one didn’t even open: it crashed every time on startup.
Sounds and music used:
- Dragon Scream, by qubodup
- Flying dragon, by Robinhood76
- Big Lizard Monster, by -sihiL
- Rumble, by unfa
- A Subtle Betrayal, by SYBS
- Introspective Spacewalk, by Asher Fulero
- Spine Chilling Cardiac Tension, by Biz Baz Studio
A very glitchy (and expensive) toy
For this lab, we learned how to set up the streaming of rigid bodies and how to use the Perception Neuron for motion capture.
We had some minor issues with the rigid body streaming, but managed to make it work anyway. First we tried to create a Blueprint Actor for it, but that was neither as straightforward nor as convenient as using the GUI. More troubling were the camera issues. Using a rigid body as the camera mount is amazing and opens up many possibilities, but the translation of its movement and rotation is troublesome: when moving the mount up, the camera rotated around two different axes, which tells us that the zero rotation of the rigid body did not match the “real-world” one. I’m sure this has an easy fix, but we could not find it during the lab.
Now, the problems with the Perception Neuron were more than “some” and far from “minor”. The first setup took a while, with us swapping the sensors over and over and reconnecting just as many times. We also got to see how fragile it is: after a handstand and some gymnastics, it completely lost its calibration. Still, it worked, and it is nice not having to rely on the whole room setup. Sadly, the hand tracking was even worse. We could not get it to work at all, and for now it is the only way to get hand tracking that we have (as far as I know). But I feel this can be attributed to a hardware failure rather than to our inability to set it up, so I’m hopeful we get a new, working one (ahem).