As this chapter of my life comes to a close, I really wanted to sit down and tell the story of this magical thing called the Experience Team here at Rooster Teeth. Some of you may have noticed on social media that both Jarrett and I released some behind-the-scenes footage of our AR, MR and VR work. Those early prototypes contain some of the cool-looking experiences, features and content we built. Our time with Rooster Teeth, and the Experience Team we built, is coming to an end. Nobody got fired and there is no bad blood; it was just time for a change, and it's for the better. I wanted to write about this year and the amazing ride it's been, from spending hours in the morning locked up in a conference room to creating one of the most fun mixed reality experiences and pushing how people interact with content. I hope you enjoy this story, and a personal thank you to everyone for being the most passionate community on this planet.
In the middle of production on RWBY 5, I kept having the same dream, playing over and over in my mind. Every night it was identical: a giant robot walking around a city, looming over everything and casting a giant shadow as it blocked out the sun. Instead of fear, I felt excited and driven to see this happen in my lifetime. How do you bring a giant mech to the city so everyone can see it? After the dream kept recurring, I knew I needed to make it a reality.
By now, Gen:Lock was slowly becoming more than an idea filled with giant mechs and crazy scientists. I would read the pitch doc over and over again, enraptured with its tech, and I wanted to know how much of it was even remotely possible. Some of the concepts littered through its world were just starting to come out of their infancy in the real world. It was like adding a pinch of sugar to a fire: Gen:Lock was a way to make that dream come true, and a lot sooner than I expected.
At the end of December 2017, I spent a lot of my after-hours time looking at the infant concept of mixed reality. With Pokemon GO, we saw augmented reality take the world by storm. But since AR was just overlaid on top of your world, it didn't truly come to life; you couldn't walk up to a Pikachu and take a close-up picture with it at the time. I wanted to go a step further. What was the next step to make this happen?
I had spent a lot of time working on how assets traveled through the pipe during production of RWBY 4 and RWBY 5. Combining that with my prior knowledge of Unreal Engine, I started building a functioning pipeline to bring assets over to Unreal efficiently and without loss of quality. I worked with Apple's ARKit, which had only just been released, with no real applications out there yet taking advantage of its power. I brought Zwei from RWBY Chibi into Unreal and worked on the first two skills you teach a dog: sit and stay. It took a lot of trial and error, building formulas and starting from scratch again and again, until one night, Zwei learned to sit and stay.
I started laughing like I'd heard the funniest joke of my life. I'm not one of the original members of the animation team; I remember watching the Ruby trailer back in Chicago while in college. I was a fan, and seeing a character from a show I both admired as a fan and loved as something I worked on brought the biggest smile to my face. Zwei was now in our world. I could see him, I could get close to him, I could walk around him. Then I asked myself my favorite question: what's next?
The possibilities of this tech kept flowing through my mind, and the next thing I wanted to tackle was a tool that let the directors view and review assets before they even hit the pipe. At this point, other people in the office began to hear about what my crazy mind was doing those late nights. I showed Brian, one of our production coordinators, and he immediately took my tablet and ran straight to Kerry with it. Word kept traveling up through the animation team, and I knew I needed to keep pushing what was possible before anyone else saw it.
At this time, I brought another person into this, as he would call it, “fever dream”. Jarrett started volunteering his time after hours to look at ways to improve what was in front of us. It didn't take long for me to start corrupting him, from “no, we can't do that” to “oh wait, I have an idea”. Without him coming onto the project, I would still be writing ideas on a wall at 4 a.m., tinkering, pushing and building random ideas.
We brought in a new character, and I loved that one late night, Gray took a selfie with that character and sent it up to the higher-ups in the company. We kept asking ourselves: what's next? What is our next step? We started bringing in short and small instances of sets, kind of like a Mario Party game board come to life, with animated water and sounds, just pushing what was possible, all on a single tablet. Then I had one of my most “horrible” ideas, because once it hit, it only took five minutes to do in this pipe. Since I was using a rudimentary system of tracking to create what we now know as 6 degrees of freedom (tracking the position and rotation of the tablet), I could take a set, make it life size, and walk around inside the virtual environment.
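For anyone curious what “tracking the position and rotation of the tablet” actually buys you, here is a minimal Python sketch (not our production code, which lived in Unreal and ARKit) of how a 6DoF device pose lets a virtual set stay anchored in the world while you walk around it. For brevity it only rotates about the vertical axis; a real pose carries full 3-axis rotation.

```python
import math

def pose_matrix(yaw, position):
    """4x4 rigid transform for the device: yaw rotation about the vertical
    axis plus a translation. A stand-in for a full ARKit camera transform."""
    c, s = math.cos(yaw), math.sin(yaw)
    x, y, z = position
    return [
        [c,   0.0, s,   x],
        [0.0, 1.0, 0.0, y],
        [-s,  0.0, c,   z],
        [0.0, 0.0, 0.0, 1.0],
    ]

def world_to_camera(pose, point):
    """Bring a world-anchored point into the device's view by applying the
    inverse pose: un-apply the translation, then the transposed rotation."""
    dx = point[0] - pose[0][3]
    dy = point[1] - pose[1][3]
    dz = point[2] - pose[2][3]
    return (
        pose[0][0] * dx + pose[1][0] * dy + pose[2][0] * dz,
        pose[0][1] * dx + pose[1][1] * dy + pose[2][1] * dz,
        pose[0][2] * dx + pose[1][2] * dy + pose[2][2] * dz,
    )
```

Walk the tablet two meters to the right and the set's world anchor never moves; it simply shifts two meters the other way in camera space, which is exactly what makes a life-size walkthrough feel solid.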
Weiss' bedroom was brought into engine because of its small size, the beautiful lighting work done by our animation post team, and mainly because of the giant window that looked out over the mountains. It was a magical moment, like I had ripped open a hole in the fabric of our reality and was now a part of the world of Remnant. We knew that night that the possibilities were truly limitless. We kept redefining our work, and I decided it was time to make that little dream in the back of my head possible. I found a super early concept of a giant mech and decided we needed to see it happen. One late night, Jarrett, Gray, Koen, Kerry and I stood outside in the cold, in the middle of the parking lot, looking up at this massive robot. Finally. That little dream that kept playing in my head had come so close to reality in just one month. What's next?
We continued to demo it for other people so they could see the tech and how well our content could create a whole new way for people to interact with beloved characters. I remember being in the mocap office where Gray and I were walking Gus through the application, and just like with everyone else, the ideas kept flowing. “What if we did this?” “This would be cool.” They wanted us to pitch it to the top brass of the company. I was hesitant and nervous about it, but with my public speaking and theater background, I was going to have to do the talking. I sat in the little makeshift office we used every night in the animation department, and Koen would rip into me to prepare me: asking question after question, pushing me to be confident. The tech speaks for itself. He reassured me that it would only be about 5 to 6 of them in a small boardroom, and I felt like I would be doing my best Charlie Day impression, explaining a crazy conspiracy to others.
The morning of the pitch, I found out the hard way that it was not 5 to 6 people and it did not take place in a boardroom. I walked into our miniature movie theater and all the way down to the “stage” at the bottom. Looking up into the blinding light of the projector, I could make out easily 15 people. I had cut a video together the previous night showing each of the experiences mentioned above, and I had my tablet with everything on it so people could experience it for themselves. I knew that with most pitches, people would bring a pitch doc with artwork and story outlines, or an idea of how an application could theoretically work. I don't think anyone was prepared to see a working example.
Jarrett and I walked out of the room, looked at each other, and just screamed. It was like having a huge weight lifted off our shoulders. Before we even got back to my car, Miles had Slacked us both: “Great Fucking Job”. I knew deep down that we did it; we were going to keep doing this as a side thing, but more officially. With the weight lifted, I could think more clearly. As good as that sounds, it's not. It meant my mind was off to the races again, thinking of that ever-so-sweet question. What's next?
By March of this year, machine learning was hitting consumers in waves, along with all the possibilities it brought to the market. From computer vision to prediction to image classification, many developers began digging their teeth into different aspects for their own use. For me, it was like adding peanut butter to my chocolate. I went back to that little meeting room and started planning out ideas for merging our tech into areas other than animation.
I don't know if anyone else has noticed, but Rooster Teeth makes some super cool merch. From t-shirts and posters to mugs and blankets, they make some of the best products. From movies like Ghost in the Shell, the Blade Runner series, and countless other cyberpunk stories, we know the concept of immersive advertisement: ads that give you more information and entice you to buy the product. I wanted people to be able to see an awesome item, learn more about the content, and know exactly where to get it. Luckily, I knew just how to achieve that.
We were both still waiting on the results of our initial pitch and riding that wave of adrenaline from the experience. Instead of twiddling our thumbs, we went to work. We began working with both TensorFlow and CoreML to develop image classification, training multiple models to handle the problem. I started working natively in Xcode, at the bare roots of ARKit, so we could integrate the models easily while still having our AR work as intended.
We started with posters because they were vastly more complex than the other items available to us. In just a few days, I had one of the RWBY posters detectable on a phone. In under a week, I was able to spawn a “TV” on the poster showcasing the RWBY 4 trailer. After that, a second TV could pop up with touch-screen controls that linked directly to the product on the store.
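The touch-screen controls are the easy part once you have an anchor: unproject the tap into a ray and intersect it with the virtual TV panel. Here is a hedged Python sketch of that hit test, with the panel simplified to a camera-facing plane centered on the axis; the real app would use ARKit's hit testing and the anchor's full orientation.

```python
def touch_ray_hits_panel(origin, direction, panel_z, half_w, half_h):
    """Intersect a touch ray with a panel lying on the plane z = panel_z,
    centered at (0, 0). Returns the hit point, or None if the tap misses.
    A simplified stand-in for a full anchor-oriented hit test."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0.0:
        return None  # ray is parallel to the panel plane
    t = (panel_z - oz) / dz
    if t <= 0.0:
        return None  # panel is behind the camera
    hx, hy = ox + t * dx, oy + t * dy
    if abs(hx) <= half_w and abs(hy) <= half_h:
        return (hx, hy, panel_z)  # tap landed on the TV: open the store link
    return None
```

If the returned point lands inside the panel, you fire the store link; otherwise the tap falls through to whatever is behind it.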
One of my favorite memories from this project was when I started working on shirts. I had to find people around the office wearing our merch and make them stand awkwardly without telling them what was happening. I saw Cesar, one of the environment artists for RWBY, wearing an older shirt and thought to myself: perfect. I trained the image classification to detect his shirt, then asked him to turn around in the kitchen and just stand there. From then on, Cesar became our unofficial model for all of our content. You can see that exact footage in some of the demos on both our websites.
We sent that footage to our internal marketing team, and it kept the flame alive in everyone's mind. Everyone knew what we were doing. The only problem was that we were still waiting for feedback on our pitch, still with no idea what we were doing with this tech. By now, RTX was starting to ramp up planning for the Animation Festival. We knew we were planning something big for RTX this year to show everyone this tech, and we needed to be on the ground floor of the planning. We joined up with the planning team run by members of Screw Attack and worked hand in hand with them on traffic, event planning and marketing. I had one goal in my mind, and RTX was the perfect stage for it. Everything had been practice up to this point.
It was time to see that mech walking around downtown Austin.
By May, we got the word. We were being pulled from Animation and allocated to a special projects team, just the two of us, working directly under Gus. We started planning what we would do for RTX in Austin, and how to create something portable enough to be taken to other conventions. It came down to this: we wanted to immerse the user in the world of RWBY, have them explore the full expo at each location, and let them share their experiences with others. The RTX Experience was ready for production.
I switched our production to the Unity engine for deeper control of the engine, finer control over the assets, and Unity's quick adoption of updates to ARCore and ARKit. While I still love Unreal Engine, I wouldn't use a saw to hammer in a nail. I spent time with our animation art director for both Gen:Lock and RWBY making sure I got the art style across. This required rewriting a cel shader and outline solution from scratch to match an art style people love. We worked with various departments on the icons we used; I designed the UI and worked on assets and animation, along with the Apple side of development.
One of the biggest things we did was our interactive crowd simulation, tucked away in a hidden room at RTX. We wanted to showcase what set us apart from everyone else. As someone who worked on crowds for parts of Volume 3, all of 4 and 5, and leading into Gen:Lock, I knew the logic and behaviors, and how we could make a very immersive world. I brought in the Menagerie set from RWBY 5, and we spent a month getting it right, from the number of Faunus walking around to the skybox and lighting. Jarrett seriously knocked it out of the park with his pathfinding and collision detection built on that crowd logic. Finally, when we booted it up on an iPhone, characters would walk out of your way; you could move in physical space and it would be reflected in this little window into that world.
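Jarrett's actual pathfinding and collision code was never published, but the “characters walk out of your way” behavior can be sketched with a simple steering rule: each agent heads for its goal, plus a push away from the user's tracked position that gets stronger the closer they are. A minimal Python version, assuming a flat 2D ground plane:

```python
import math

def crowd_step(agent, goal, user, speed=0.05, avoid_radius=1.0):
    """One simulation tick for a single agent: seek the goal, and blend in an
    avoidance push when the (real, tracked) user is within avoid_radius.
    A stand-in for the production pathfinding, not the real logic."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1])

    def norm(v):
        m = math.hypot(v[0], v[1])
        return (v[0] / m, v[1] / m) if m > 1e-9 else (0.0, 0.0)

    seek = norm(sub(goal, agent))
    away = sub(agent, user)
    dist = math.hypot(away[0], away[1])
    if dist < avoid_radius:
        weight = (avoid_radius - dist) / avoid_radius  # stronger when closer
        ax, ay = norm(away)
        seek = norm((seek[0] + 2.0 * weight * ax, seek[1] + 2.0 * weight * ay))
    return (agent[0] + speed * seek[0], agent[1] + speed * seek[1])
```

Run one tick per frame for every Faunus in the set and the crowd parts around wherever the phone says you are standing.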
Because we were working with Screw Attack on planning the Animation Festival, we also lent a hand with the rave on Saturday night. Chad had this amazing and horrible idea of having the RWBY girls perform in AR as the opening act, similar to a Miku vocaloid performance. We needed dance animation, and neither Jarrett nor I are light on our feet, but luckily Chad had another idea that could work really well. Yssa, our lead 2D animator at the time, was happy to mocap the dance from RWBY Volume 2. I wanted to redo the mocap to test something new coming down the pipeline for us; directing a mocap session was also something I had always wanted to do, and it was fun to finally do it. We decided to apply it to Ruby and two Beowolves acting as backup dancers, knowing it would make everyone laugh. When the night came and the performance played, listening to everyone laugh and cheer is something I'll never forget.
RTX was also going to be the premiere of Gen:Lock and its Holons (giant mechs in the show). I needed this dream to happen.
After digging around, I reached out to the developers at Mapbox, a company specializing in open source mapping with an integration for Unity. Jarrett worked his magic to adapt the same crowd logic into a world-scale solution. Outside our new office at Stage 5, we had a Holon that could walk around Austin, but then we hit a pretty big snag. We couldn't get occlusion working as intended, and with RTX the very next week, I had to do a hot swap. In about an hour, I swapped it for the giant Sea Serpent Grimm from Volume 4, one of my favorites, and it flew around downtown at the Austin Convention Center.
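Under the hood, a world-scale solution comes down to putting map coordinates and engine coordinates into the same flat space. Mapbox's Unity SDK handles this conversion internally; as an illustration, here is the standard Web Mercator projection in Python, turning latitude/longitude pairs into meter offsets from a scene origin (the kind of math that lets a mech anchored to a street corner stay put as you move):

```python
import math

R = 6378137.0  # Web Mercator sphere radius in meters

def mercator(lat_deg, lon_deg):
    """Project lat/lon onto Web Mercator meters, the projection Mapbox tiles
    use, so map features and AR content share one flat coordinate space."""
    x = math.radians(lon_deg) * R
    y = math.log(math.tan(math.pi / 4.0 + math.radians(lat_deg) / 2.0)) * R
    return x, y

def to_local_meters(origin, target):
    """Offset of target from origin in meters: drop a scene anchor at origin,
    then place world-scale objects at these offsets."""
    ox, oy = mercator(*origin)
    tx, ty = mercator(*target)
    return tx - ox, ty - oy
```

Note that Web Mercator stretches distances away from the equator, so production code scales by the cosine of the anchor latitude; the sketch skips that correction for clarity.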
As for the Holon, I didn't want to toss that work out the door. So, as an incentive: if you found every item, went to every location, and got your hands on the rave performance, a very special surprise was waiting to tower over you in all of its anime glory. With RTX done, we did some minor things for London, adding Camp Camp and RWBY Chibi. But after that, what's next?
This is where we catch up to right now. What is next? We had projects in various stages of production, ranging from some pretty cool at-home VR experiences and field VR (like you see at The VOID) to some major AR and MR experiences that I'm still holding on to. We also set up the foundation for some pretty cool live action work in virtual production, based on the new Unreal pipeline I helped build for Death Battle. Hopefully you will start seeing some of that put into production very soon. As for the Experience Team, we went from a dream keeping me up at night to our amazing community posting their findings and taking pics with their favorite characters. While my time with Rooster Teeth is coming to an end for now, I'm always going to be proud of the work I've done with Animation, and of creating the Experience Team.
Let’s go find out what the next chapter is.