The making of Elder Scrolls Online's epic cinematic trilogy
Blur director Dave Wilson and FX supervisor Brandon Riza discuss how their vision for Elder Scrolls Online's cinematic trailers grew into a trilogy.
We look at the making of The Alliances, The Arrival and The Siege, the Elder Scrolls Online short cinematics nominated for a 3D World CG Award 2014.
How long did the project take?
Cumulatively, the campaign took about 18 months to complete, with upwards of 100 artists contributing in some way, shape or form – about 20 at any given time. We began work on ‘The Alliances’ in early 2012 and delivered in October of that year. I started writing the narratives for ‘The Arrival’ and ‘The Siege’ while still working on the first trailer, so there was a lot of overlap, with each installment upping the ante and building on the previous action.
How did the Elder Scrolls project start?
Bethesda’s creative agency, AKQA, had collaborated with Blur on a different project a few years earlier, so they were familiar with the caliber of game cinematics we bring to the table. They approached us with a loose outline for a 3-minute trailer highlighting the various factions in the Elder Scrolls universe. We then fleshed out and extended the story.
How did the story develop?
As we neared completion of the first trailer, we began discussing the possibility of turning the campaign into a trilogy, giving each faction its own trailer. While the concept for ‘The Alliances’ was fairly well established when we got involved, the next two were far looser, which gave us a lot of creative latitude to push the scale and scope. We constructed the narratives for ‘The Arrival’ and ‘The Siege’ at the same time, so we had an idea of where the story was going, which allowed us to make more informed creative choices. Bethesda and AKQA provided a few key moments, then we went back and forth on ideas to fill in the blanks.
How did the trilogy develop?
The project didn’t originate as a trilogy but began heading in that direction as we wrapped ‘The Alliances,’ which is when we started mapping out the bigger picture. We’d already done the heavy lifting on asset and character creation with the first trailer, so we were able to focus on taking the story further than we’d ever hoped. With the bulk of the initial design and modeling complete, we could devote additional resources to things like developing simulation loops for the mass destruction scenes and fine details such as the movement of the Altmer elf’s hair.
What challenges did you face?
Our biggest challenges were more artistic than technical, and in my opinion, the end result is some of the best work we’ve ever done. Our team had to split screen time between three, almost four, characters and weave the storylines together in a way that didn’t marginalize one character in favor of another. Personally, I enjoy sailing into uncharted waters, and I challenged our team to create something new and exciting. Still, we had to ground each cinematic in the reality of the game, heightening the experience while also making sure the action was plausible. We have a few diehard Elder Scrolls fans on staff, and they helped us stay true to the IP. Also, the trailers needed to tell a cohesive story and feel like a logical progression, with each installment working as a compelling standalone piece as well.
What pipeline or workflow did you need to adopt?
For the most part, we used our typical workflow but made some modifications to address specific challenges and fully realize the story. We found new methodologies for working with hair and cloth and set the framework for a proprietary crowd system that has been put in place on a larger scale within Blur.
How many people worked on the trilogy?
Our team was about 20 at the peak of production but over 100 artists worked on the project. At any point, we’d have around 4 people working on previz, 15 people on animation and 15 on lighting – but that wasn’t all at the same time.
What tools and software did you use?
We used Autodesk 3ds Max for previz, modeling and lighting, with a little bit of Mudbox for modeling and shading; MARI for texture painting and ZBrush for character sculpting; and Softimage for rigging and animation. We rendered in V-Ray and composited in Digital Fusion. RayFire/PhysX, Thinking Particles and FumeFX played a big part in creating the large-scale destruction effects in the second trailer, and we deployed an insane hair pipeline using Ornatrix. For the third trailer, our team enlisted a motion capture studio and stunt coordinator to lay the foundation for some of the action.
How was The Siege's final battle planned?
The original concept was to follow the journey of a scamp in one of the ‘Siege Bombs’: up on the battlements, getting into the Trojan horse-like artillery, the pod splitting open, then rushing after him as he made his way to the Nord, all the while catching glimpses of the battle as we went. After that, we just had small narrative elements that we needed to convey through the journey. We went through the usual boards and previz process, meeting often with both the animation and CG supervisors on how to handle the layers of characters that we needed.
And what did you struggle with?
In every production there are technical hardships that need to be overcome, but those hardships are more often than not matched with “wide-eyed high-five moments of jubilation” where we remind ourselves why we love what we do. The helicopter flyover of the giant battleground was one of those moments. That shot was NOT planned until Jerome Denjean (CG Supervisor) put together a test of what he just THOUGHT we could do. It was incredible! We had no shot like that in previz, and immediately we all realized “it’s got to be in there”, so we went about making sure that it was.
How did you make the crumbling wall in The Siege?
(Answered by Brandon Riza, FX Supervisor) I created a target-seeking branched spawning particle system with both Thinking Particles and Particle Flow, getting slightly different yet equally interesting results.
I meshed these systems with Thinkbox Frost, taking advantage of the robust feature set it has to offer to create a plasma-like renderable object, which was subsequently XMeshed. This branching lightning system terminated in distinct points of impact, at which I randomly distributed FumeFX simulations (explosions) and RayFireCache RBD simulations (character debris).
I was able to duplicate and scatter these preset systems at the termini of the branching particle systems using a tool we developed at Blur for strictly this purpose. The end-result was total destructive chaos across a battlefield populated by characters cached into alembic meshes. For the keep walls, I used RayFire to Boolean auto-fragment the geometry and the Bullet sim engine to sim out 1000+ contiguous frames of rigid body dynamics. I, of course, added FumeFX and particles to flesh it all out.
Additionally, I created 8TB of FumeFX simulations as library assets that I distributed to our entire Scene Assembly team to use as set dressing elements. I always love watching what happens to the project aesthetically when 15 guys start adding billions of voxels to entire scenes of shots...
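The scattering tool Riza mentions is proprietary to Blur, but as a rough illustration of the general idea, the Python sketch below (every name, path and parameter here is hypothetical, and it stands in for whatever the in-house tool actually does) distributes pre-simulated library assets at the impact points of a branching particle system, varying rotation, scale and sim start frame so repeats are less obvious:

```python
import random
from dataclasses import dataclass

@dataclass
class FXPreset:
    """A pre-simulated library asset, e.g. a cached FumeFX explosion or RayFire debris sim."""
    name: str
    cache_path: str
    frame_count: int

@dataclass
class FXInstance:
    """One placed copy of a preset: where it sits and when its cached sim starts playing."""
    preset: FXPreset
    position: tuple      # world-space impact point (x, y, z)
    rotation_z: float    # random spin so duplicates read differently on screen
    scale: float
    start_frame: int

def scatter_presets(impact_points, presets, shot_start=1001, max_delay=12, seed=42):
    """Drop a random preset at each particle terminus with slight timing/scale variation."""
    rng = random.Random(seed)
    instances = []
    for frame_hit, point in impact_points:
        preset = rng.choice(presets)
        instances.append(FXInstance(
            preset=preset,
            position=point,
            rotation_z=rng.uniform(0.0, 360.0),
            scale=rng.uniform(0.85, 1.25),
            # cached sim begins on the frame the branch strikes, plus a small random delay
            start_frame=max(shot_start, frame_hit + rng.randint(0, max_delay)),
        ))
    return instances

if __name__ == "__main__":
    library = [
        FXPreset("fume_explosion_A", "/fx/library/fume_explosion_A.fxd", 96),
        FXPreset("rayfire_debris_B", "/fx/library/rayfire_debris_B.rfcache", 120),
    ]
    # (frame, position) pairs exported from the branching particle system's end points
    hits = [(1012, (4.2, 0.0, -1.7)), (1015, (6.8, 0.0, 2.3)), (1019, (1.1, 0.0, 5.6))]
    for inst in scatter_presets(hits, library):
        print(inst.preset.name, inst.position, inst.start_frame)
```

In production the placed instances would reference actual cached geometry and voxel data rather than printing to the console, but the core trick is the same: author a handful of expensive simulations once, then instance them cheaply across hundreds of impact points.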
How did your relationship with Bethesda help?
Between Bethesda, AKQA and Blur, the creative process was highly collaborative, and our sensibilities aligned nicely. It was more about figuring out how we could best serve the story rather than following an explicit set of rules; however, finding the right balance was key. We wanted to push the limits creatively, but not so much that the action became too far removed from the game. Bethesda provided a lot of in-game assets that we used as concept art, and AKQA designed two of the main characters, so we had a solid frame of reference as a jumping-off point.
Do you have a favorite scene or character and why?
The amount of craftsmanship that went into every frame of this show is staggering. The result is an embarrassment of riches too plentiful to unfairly single out any one shot. There are simply too many shots, scenes and moments that leave me floored. Every character was slaved over in a constant drive to improve on the last. It’s a wholly impressive achievement.
Did you create any shortcuts to help production?
Not so much shortcuts, but at this point in the trilogy we’d really dialed everything in. On most productions, you somewhat have to “pack the chute on the way down” – fixing shaders and tweaking rigs; it’s just the nature of a business with often demanding deadlines. But at this point, the characters had run the gauntlet, twice, and were really dialed in. That was probably the biggest production boon to this show; everything was tried, tested and perfected. We were working with production ready assets – a rarity in VFX – and that meant more time to craft the animation and lighting.
Did you use any new software?
We used MARI for the first time to texture the Atronach flesh beast and the wood elves in ‘The Arrival.’ Our CG modeling supervisor Mathieu Aerni was able to texture a character while seeing the exact result in 3D, which is much faster than painting on a flat UV then reloading. Since MARI uses a layer-based philosophy like Photoshop’s, Aerni could use blend modes and manage large textures in real time. The Atronach is 32 feet tall, so most shots are close-ups. To get the required resolution, Aerni created hand-painted textures in 8K using MARI’s default Organic Brushes set, and was then able to display those textures, with reflection and glossiness maps, accurately in real time in MARI’s viewport.
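MARI's layer engine is its own, but the blend modes it shares with Photoshop come down to simple per-pixel maths. As a purely illustrative numpy sketch (this is not MARI's API, and the maps here are tiny stand-ins for the 8K textures Aerni painted), this is roughly what 'multiply' and 'screen' layers do:

```python
import numpy as np

def blend(base, layer, mode="normal", opacity=1.0):
    """Composite one texture layer over another, Photoshop-style.
    base, layer: float arrays in [0, 1], e.g. an 8K RGB map of shape (8192, 8192, 3)."""
    if mode == "normal":
        result = layer
    elif mode == "multiply":
        result = base * layer                      # darkens: white leaves base untouched
    elif mode == "screen":
        result = 1.0 - (1.0 - base) * (1.0 - layer)  # lightens: black leaves base untouched
    else:
        raise ValueError(f"unsupported blend mode: {mode}")
    # opacity linearly mixes the blended result back over the base layer
    return base + (result - base) * opacity

if __name__ == "__main__":
    # Tiny stand-in maps; a production character map would be 8192 x 8192.
    skin = np.full((4, 4, 3), 0.55)
    grime = np.random.rand(4, 4, 3) * 0.3
    out = blend(skin, grime, mode="multiply", opacity=0.6)
    print(out.shape, float(out.min()), float(out.max()))
```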
Would you want to create in-game or real-time cinematics?
We’ve often been approached to do this sort of work, and it’s definitely an aspect of production we’re keeping a close eye on. There are many benefits to incorporating a fast visualizing workflow into our production pipeline. In the past, we’ve explored options for how to fit it into what we do, and sooner or later, we’re sure it’s going to happen. What games are now able to do in real time is quite incredible, and making use of those technical advancements to help tell our stories, whether it’s highly advanced previz or possibly fully realized trailers, is, in my opinion, inevitable.
Vote now in this year's CG Awards for your favourite CG video game promotion. Voting closes 28 July.