Framestore takes FLITE with Unreal
FLITE director Tim Webber discusses merging two pipelines to make the ambitious short film.
An image of a girl on a hoverboard teetering at the window of a luxury high-rise building, deciding whether to break away into the semi-submerged London of 2053, together with the concept of a Memory Investigator, provided Framestore’s VFX supervisor and creative director, Tim Webber, with the chance to create and direct a short film using FUSE (Framestore Unreal Shot Engine).
FLITE was shot over five days and assembled into a 14-minute sci-fi drama. It revolves around a stranger who recounts what he saw to local law enforcement in an attempt to help a young champion hoverboarder escape her oppressive manager.
Providing a guideline for the production methodology was Gravity, the blockbuster feature that earned Webber an Oscar and a BAFTA Award alongside Chris Lawrence, David Shirk and Neil Corbould. “We did a lot of things on Gravity that opened my eyes to the possibility of working in that way and using current technology to make it easier, better, quicker, and less expensive,” states Webber.
“One of the key things from Gravity was doing previs for the whole movie and going beyond normal previs, so you have the ability to make a more informed judgement of how well it works, even before getting to shooting. Expanding that to the whole process enabled us to change the flow of how you develop the creative aspects of the story.
“Because we completed everything right to the final pixel in Unreal Engine, we could work up the previs with high-quality assets, and do further work in animation that you might not normally have done, like designing the final lighting. You’re seeing the whole movie in a much more complete form.
“As you shoot, performances can quite easily be dropped in. As you animate, everything is live thanks to it being in Unreal Engine. You have a lot more context than you would normally have to make judgements about how the whole thing works.”
Virtual camera and location scouting were conducted in real-time through an iPad or VR headset. “I put the actors in the VR headsets so they could understand and explore the environment, whether it be the apartment or the city,” remarks Webber. “Because everything is more interactive and immediate, it enables lots of people working on various aspects to be looking at stuff in more context, and you’re all looking at the same thing together. Someone is tweaking the lighting while another is modifying the animation.
“Very quickly you see it all together, and you’re looking at something that might not be final quality but gives you a good sense of how the end product will look, and is more responsive. You can collaborate much better under those circumstances.”
Lens aberrations and glitches had to appear organic and accidental. “That’s achieved through craft! For glitches and lens flares, we did a lot of work to embed a proper compositing workflow into the engine,” adds Webber. “That took a lot of additions to the engine, so that the compositors could get all of the elements they needed out of the renders in a filmic way and work on them in Nuke.
“This enabled us to do more sophisticated, bespoke crafted lens flares and more interesting glitches. Also, there are the benefits of real-time. You’re wandering around with an iPad and discovering camera angles; it’s much more organic and you can have happy accidents.
“It enabled us on-set with the actors to have the freedom to do whatever we wanted. We’re not limited to what we previs. Everyone could understand what the shot was going to be because we were working in real-time and it was giving immediate feedback. It’s not like wandering around a bluescreen not being sure what you’re looking at.”
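Framestore hasn’t published FUSE’s internals, but Unreal’s stock Movie Render Queue gives a flavour of the render hand-off Webber describes. The Python sketch below, which assumes the editor’s Python scripting plugin is enabled and uses invented map, sequence and output paths, queues a shot and writes multi-layer EXRs, the format a compositor would split into elements in Nuke.

```python
import unreal

# Get the editor's Movie Render Queue and add a job for one shot.
# All map/sequence/output paths here are placeholders, not FLITE content.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.map = unreal.SoftObjectPath("/Game/Maps/Bridge_Chase")
job.sequence = unreal.SoftObjectPath("/Game/Shots/sq020_0110/sq020_0110_seq")

config = job.get_configuration()

# Multi-layer EXR output: each render pass becomes a layer the
# compositor can split out in Nuke.
exr = config.find_or_add_setting_by_class(
    unreal.MoviePipelineImageSequenceOutput_EXR)
exr.multilayer = True

# The standard deferred render pass; extra AOVs would normally be
# added via post-process materials on this setting.
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)

output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
out_dir = unreal.DirectoryPath()
out_dir.path = "C:/renders/sq020_0110"
output.output_directory = out_dir
output.file_name_format = "{sequence_name}.{frame_number}"
output.output_resolution = unreal.IntPoint(3840, 2160)

# Render in-editor via the Play In Editor executor.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```

FUSE’s bespoke compositing additions clearly go beyond this, but the queue-and-render pattern is the standard starting point the studio would have built on.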
A shift in perspective from third- to first-person occurs with a mirror reflection, which altered how the shots were designed. “That added complexity because it made the shots long and continuous,” says Webber. “It’s a story told through the memory and point of view of Jones (Daniel Lawrence Taylor), which meant long, continuous shots that required complicated and choreographed setups. The chase sequence on the bridge would be incredibly complicated to do on any film, let alone on a short film budget, but using this method we were able to do it.
“When Jones, the window cleaner, is putting his glasses on and then it’s through his POV, there are times when the camera is portraying his personality because you want to be aware of his presence. Then there are other moments you don’t particularly want to be aware of his presence, and we seamlessly segue into a much more cinematic camera that is dollying along watching Stevie (Alba Baptista) ride across the studio, and having an argument with Johnny (Gethin Anthony).”
Most of the time there is a live-action component to the characters. “It would have been a huge amount of work to do closeups of CG faces and get them to feel nuanced, real and human. But by using live-action and CG at the right time, we made it much easier to achieve and got a better performance than doing it other ways.”
Virtual production was part of the process, but only as a point of reference. “Even when we were filming the actors, it wasn’t typical LED volume in-camera visual effects,” says Webber. “We didn’t do any in-camera VFX or capture the background. The LED volume wasn’t providing the final image, but it was useful in giving us the actor, setting and lighting, which made it all come together well.”
The Tower Bridge chase was complicated because you want to see a combination of reaction and wide shots within the framework of a single long, continuous take. “Choreographing it was quite a challenge,” Webber admits. “I did basic storyboards from the beginning, but we quickly got into previs, where most of it was worked out. It was carefully planned, and we worked out which bits of her we needed to capture.”
The project was extremely ambitious despite the restriction of a short-film budget. “We tried to do a bunch of tricky things because I wanted to show that this method would enable you to do that,” Webber adds. “It enables you to plan in a way that there’s less wastage and you’re more focused on what you actually need to get the film to work, so more of the work goes up on the screen.”
Artistic FUSE
The Framestore Unreal Shot Engine (FUSE) pipeline leverages past expertise with real-time game engine technology
Whereas animated shorts created in Unreal Engine tend to be made by small groups of artists, FLITE was an initiative to see if the process could be scaled up for a feature or television production. Webber says: “FUSE is a pipeline based on decades of building traditional VFX pipelines that enable you to work at scale on thousands of shots, and it brings together hundreds of artists with their own specialties in ways you can’t do if you’re working in Unreal Engine alone. We built these tools and pipelines on top of Unreal to bring a team of that size and complexity together.”
A hybrid approach was adopted. “We have animators with decades of experience, but they’re not going to be trained to work in Unreal because its animation tools aren’t there yet,” explains Webber. “They can continue to work with the animation tools they’re used to in Maya, but they’re working with the real-time game engine toolset that is part of our Unreal pipeline. Theo Jones, the visual effects supervisor, and I, as director, got immediate feedback in Unreal and were able to comment on the animation. What used to take a couple of days (doing a bit of animation, rendering it out and having it comped) could happen in 10-20 minutes with FUSE, and we were able to give much better feedback because everything could be seen together in context.”
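Again, this isn’t FUSE itself, but the Maya-to-Unreal round trip Webber outlines can be sketched with Unreal’s standard Python import API: an animator publishes an FBX from Maya, and a script on the Unreal side re-imports it automatically so the shot updates in context. All paths and asset names below are hypothetical.

```python
import unreal

# Hypothetical paths; a real pipeline would resolve these from its
# asset-management layer rather than hard-coding them.
FBX_PATH = "C:/shots/sq010_0040/stevie_anim_v012.fbx"
SKELETON = "/Game/Characters/Stevie/Stevie_Skeleton"
DEST = "/Game/Shots/sq010_0040/Anim"

def import_maya_animation(fbx_path, skeleton_path, dest_path):
    """Bring a Maya FBX animation export into Unreal as an AnimSequence."""
    options = unreal.FbxImportUI()
    options.import_animations = True
    options.import_mesh = False
    options.import_as_skeletal = True
    options.skeleton = unreal.load_asset(skeleton_path)
    options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = dest_path
    task.automated = True          # no import dialogs; suits a watch folder
    task.replace_existing = True   # a new version overwrites the last take
    task.options = options

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

import_maya_animation(FBX_PATH, SKELETON, DEST)
```

Run on publish or from a watch folder, a step like this is what lets animators stay in Maya while the director reviews the result live in the engine, the 10-20 minute turnaround Webber describes.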
This interview originally appeared in 3D World, the world's leading digital art, CG and VFX magazine. 3D World is on sale in the UK, Europe, United States, Canada, Australia and more. Limited numbers of 3D World print editions are available for delivery from our online store (the shipping costs are included in all prices).
Trevor Hogg is a freelance video editor and journalist, who has written for a number of titles including 3D World, VFX Voice, Animation Magazine and British Cinematographer. An expert in visual effects, he regularly goes behind the scenes of the latest Hollywood blockbusters to reveal how they are put together.