Hi From The Future is here, ready to save you from production woes
Interview: Elliot Higgins & Mark Rubbo of the Brooklyn studio Hi From The Future
Hi From The Future isn’t a time machine movie, but a Brooklyn studio that produces interactive content for clients ranging from Spike Lee and Cadillac to Pharrell Williams and Netflix. Recently, we Zoomed with founders Elliot Higgins and Mark Rubbo to discuss how they produced the video for Duckwrth’s “Find a Way” in just five weeks, despite pandemic protocols.
Q: First, a little about your studio. When did it form, and how did it come together?
Mark: Elliot and I are longtime friends who shared a vision of a creative studio that uses new tech to solve age-old production problems. At the time, the projects we were working on didn’t seem optimized for a 21st-century media landscape. What was needed was a new kind of studio that could create content across digital channels and act as a partner for larger studios, brands, and artists who needed that kind of work.
Q: How important are real-time technologies to your studio? How important do you feel they are to VFX and why?
Elliot: Real-time technology is essential. Tools like Reallusion’s iClone and Character Creator extend the creative window. Because we can iterate almost until delivery, we sidestep the limitations of traditional production, where you are forced to lock in creative decisions at the early stages of development.
Q: How did the project for “Find a Way” come to be? Can you explain a little about how your studio became involved?
Mark: The short answer is that Resolve Media’s Chad Tennies, who later became co-director on the project, brought us the opportunity: five weeks, an exciting young artist, and the question of whether we could pull off an animated shoot despite the lockdown and limited access to the artist. Of course, the real answer is longer and more complex. “Find a Way” is a project we had spent years preparing for. All the research and testing we did on different software platforms, and the work of figuring out a proper workflow, was key. So was all our experience on traditional shoots, which we adapted to a virtual production.
Q: Can you give me an idea about your process for pre-viz? Was the project storyboarded ahead of time, or did you make it up as you went along?
Elliot: Due to the five-week crunch, we didn’t have time for traditional storyboarding. Mood boards, style frames, and other reference material served as the substitute, creating a through-line for the production and ensuring that the vision stayed consistent with expectations.
Q: How did you make the characters? What software did you use? What role did Reallusion’s Character Creator play?
Mark: We started with Zoom calls with the artists, where we got a general sense of their style and what they wanted to bring to their performances. These sessions became mood boards, from which we began developing the characters. The advantage of working with video game tools is that you never start from zero. Character Creator assets all came pre-rigged, which allowed us to apply custom motion capture or dub over stock animations for an extra layer of polish.
The character workflow went something like this: We used Character Creator for the base, then punched it up in ZBrush. Substance Painter handled texturing for skin and clothes, and all the animation happened in Reallusion’s iClone. Maya’s XGen came next for the hair grooms, before export to Unreal. When we had to make changes, we simply jumped back to an earlier step. Thankfully, we were working with stylized avatars, so we didn’t have to worry about photogrammetry.
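For readers curious what that last hand-off can look like in practice, here is a minimal sketch of batch-importing a rigged character FBX into Unreal through the editor’s Python API. The file path and destination folder are placeholders for illustration, not the studio’s actual setup or script.

```python
# Minimal sketch: importing a rigged character FBX into Unreal via the editor's Python API.
# The paths below are placeholders, not Hi From The Future's actual pipeline.
import unreal

task = unreal.AssetImportTask()
task.filename = "D:/exports/character_base.fbx"    # hypothetical export from the DCC side
task.destination_path = "/Game/Characters/Lead"    # hypothetical project folder
task.automated = True                              # skip the interactive import dialog
task.save = True                                   # save the imported assets immediately

options = unreal.FbxImportUI()
options.import_mesh = True
options.import_as_skeletal = True                  # the character arrives pre-rigged
options.import_animations = True                   # bring baked animation along with it
task.options = options

# Run the import; the resulting skeletal mesh and animations land in destination_path.
unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```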
Q: I believe that the facial expressions and performance were done with Reallusion Faceware tools. Is that so? Were there any other techniques? Did you manually animate any of the faces with morph targets?
Elliot: Yes, that’s correct. We used Reallusion’s LIVE FACE for the close-ups. We worked with the artists over Zoom to capture their performances. The mocap provided a rough base, with manual adjustments tuning the details and accentuating key expressions. For more advanced features, such as hair, we had to pull out all the stops. We wound up using a new Unreal plugin that supports real-time hair simulation. We hadn’t used the plugin before, but it soon turned into a great tool for rendering Duckwrth and Radio Ahlee’s beards, and for doing justice to Alex Mali’s green curls and Bayli’s Bantu knots.
Q: Can you give us an idea about the animation? How much did you rely on motion capture? Seeing how this project was most likely created during the Covid era, what processes did you use for motion capture of the dancing?
Mark: Markerless motion-capture sessions over Zoom helped us nail down the performance sequences, whether the performers were dancing, singing, or rapping. Everything was 100% virtual due to pandemic restrictions, especially when we were working on this project last spring, during the first wave of infections. For each session, we had the artists download the Moves by Maxon app and hit record. These sequences were later combined with stock moves, such as when Duckwrth jumps off the cliff (1:36).
Q: Regarding the environments, I believe that they were created, lit, and passed along to the final rendering all within Unreal Engine. Was that the case? What are some of the benefits to using Unreal Engine? After using it on this project, do you see it playing an important role in future productions?
Elliot: As with character development, working with pre-existing assets from Quixel Megascans and the Epic Marketplace gave us a major head start in creating the environments. The objective was fitting the environments to the story so they advance the narrative. It was also about replacing preconfigured design elements with custom assets that connect directly and personally with the characters.
For instance, there is the illuminated rune that takes us into the video. That object was taken from Duckwrth’s sigil, or personal talisman, used in the album art. It introduces the questing, adventurous tone we were going for, while making a personal connection to Duckwrth’s outlook.
The process of curating and customizing environment assets combines level design for a video game with set design for a film shoot. It’s worth stressing that the stock elements we used as a starting point weren’t full environments; they were individual rocks, trees, textures, buildings, and other features.
The shoot in Unreal was both the easiest part of the process and the most stressful. After bringing all the major assets into our main staging level, we spent the final week nailing down camera angles and making sure the tone and quality of the piece were consistent. Since we were dealing with technology designed to do something different from what we wanted it to do, small problems kept arising, and our team had to keep refining and debugging. The ideal we were shooting for throughout was real-time decision-making and live changes.
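To make the “level design meets set design” idea concrete, here is a minimal sketch of placing a stock mesh into an Unreal level from the editor’s Python API. The asset path, transform, and scale are placeholders for illustration, not values from the studio’s scene.

```python
# Minimal sketch: set-dressing an Unreal level by spawning a stock mesh from script.
# The asset path and transform values are placeholders for illustration only.
import unreal

# Load a (hypothetical) Megascans-style static mesh already imported into the project.
cliff_mesh = unreal.EditorAssetLibrary.load_asset("/Game/Environment/Rocks/SM_Cliff_01")

if cliff_mesh:
    location = unreal.Vector(1200.0, -300.0, 0.0)   # where the set piece sits in the level
    rotation = unreal.Rotator(0.0, 0.0, 45.0)       # roll, pitch, yaw in degrees
    actor = unreal.EditorLevelLibrary.spawn_actor_from_object(cliff_mesh, location, rotation)
    actor.set_actor_scale3d(unreal.Vector(1.5, 1.5, 1.5))  # scale the stock rock to fit the shot
    actor.set_actor_label("Cliff_SetPiece_01")      # readable name in the World Outliner
```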
Q: What were some of the challenges you faced working on this project? How could real-time production be better?
Mark: Beyond the extreme time constraint and the pandemic protocols, the major challenges we faced involved collaboration between the character and environment teams. Having multiple artists working in the same shot at the same time would be ideal. For us, real-time production could improve by better approximating the collaborative dynamism of an actual film set.
Q: What cool projects are coming up at Hi From the Future?
Elliot: The “Find a Way” project has been a great conversation-starter for us. People who were resistant before the pandemic are now embracing the possibilities of what can be accomplished with real-time technology. As a studio, we are working on our own original content, in addition to adding our expertise to Twitch, VR game, film, and TV projects. What we predicted a few years ago has come true, and it’s a lot of fun!