Publisher: Nordcurrent Labs
Developer: River End Games
Platforms: Windows PC (Steam, Epic Games Store), PlayStation 5, Xbox Series X|S
Engine: Unreal Engine 5
Release date: 2025
Unreal Engine 5 has been helping small game dev teams achieve great things since its release, but with the 2021 launch of MetaHuman Creator, followed by the GDC 2023 demo of MetaHuman Animator, many developers found technology that can help them achieve triple-A results in less time.
MetaHuman puts tools in the hands of small indie developers that are usually the preserve of large VFX studios. The tool is part of Epic Games' free MetaHuman plugin for Unreal Engine and can transfer facial performance from an actor to a MetaHuman character model. Uniquely, it works with many facial camera systems, including iPhone, making it a good choice for small teams.
The technology was demoed to impressive effect in Senua's Saga: Hellblade II, and since then more studios have adopted it, including River End Games, developer of the upcoming isometric, narrative-driven stealth game Eriksholm: The Stolen Dream. The team is made up of developers whose previous credits include Battlefield, Little Nightmares and Mirror's Edge, and MetaHuman Animator has impressed them.
Below, Anders Hejdenberg, founder and creative director of River End Games, explains how MetaHuman Animator has enabled his small team of 15-17 developers to elevate the game's storytelling and deliver triple-A quality performances.
CB: Why choose Unreal Engine 5 and MetaHuman for this game?
Anders Hejdenberg: When we began pre-production of this game, the only two game engines that were viable options were Unreal Engine and Unity. At that point in time, Unity had yet to implement PBR in its rendering pipeline, so Unreal Engine became the natural choice.
MetaHuman wasn’t available at this point, so it wasn’t until later that we adopted the technology. Before MetaHuman, we had another pipeline in place with a face-scanning rig that we built ourselves. The results were really good, but it was a very slow and tedious process.
CB: How does MetaHuman help indie developers and smaller teams achieve high-quality character design?
AH: It’s a very powerful tool that allows for a lot of customisation, but this is only part of the equation. An aspect that is easily overlooked is just how difficult it can be to create realistic shaders for skin, eyes, teeth, and hair – but all of these are included in the package.
Epic has done all that work for you, which would be very difficult to do yourself unless you’re already an expert. Not to mention the rig setup to control the blends between facial expressions, and how that is tied into blends between normal maps and textures.
The facial rig setup that comes with MetaHuman is of a very high standard, so you don’t have to be an expert there either. All of this combined makes MetaHuman an extremely useful tool, especially for smaller teams that don’t necessarily have the resources or know-how to do those things themselves.
CB: What storytelling or performance challenges did MetaHuman solve for this game?
AH: MetaHuman Creator allowed us to quickly create believable characters with a very wide range of facial expressions. But the other part of the equation is MetaHuman Animator, which does a tremendous job of capturing our actors' performances and applying them as animation data to the facial rig.
Before MetaHuman Animator, there was quite a wide gap in capability between software that's available to the public and software used at VFX companies. It essentially meant that you either had to do a great deal of hand animation to get good results, or you had to write your own software, which is what a lot of VFX companies have done. But MetaHuman Animator changed that in a very significant way.
CB: How has MetaHuman with UE5 tools like Lumen and Nanite helped the immersion of the game?
AH: For performance reasons we use baked lighting in Eriksholm, but for our cinematics we use Lumen since those are pre-rendered movies. Using Lumen in our cinematics allows us to tweak or re-arrange the lighting for each shot in a scene dynamically, which is a very useful process in order to get each shot looking as good as possible.
CB: How much creative control does MetaHuman offer for customisation, to make sure characters fit the unique narrative of your game?
AH: There are a lot of customisation options in MetaHuman Creator, but you can also use the Mesh to MetaHuman feature to make small or large adjustments.
CB: Can you share examples of how MetaHuman enabled you to create nuanced character expressions and performances?
AH: The fact that MetaHuman Animator’s solve is not based on just 2D video data, but 3D mesh data generated from stereo video input, means that you get a much better solve where very subtle nuances of the actor’s performance are captured in a very convincing way.
CB: Did you face any limitations when using MetaHuman? How did you overcome them?
AH: The only problems we've encountered so far have been related to creating very young characters, where there's not a whole lot in the database to work from. But we managed to overcome it by blending multiple types of characters together, and using the Mesh to MetaHuman feature for additional tweaks to proportions.
CB: Before MetaHuman how much longer would this kind of character creation take, and could you do it with your team size?
AH: I would say that it would probably take twenty times longer to create characters with our previous pipeline. But it’s not just that – with our previous pipeline we would scan real people, which means that you also have to find those people and negotiate contracts with them.
There’s also the aspect that once you had a finished character, it was a very tedious process to make adjustments to it – so you really had to stick with what you had. With MetaHuman Creator on the other hand, you can try a character out in a scene – and if you don’t like a certain aspect about it, you can just change it in the tool.
Even though we're just 17 people on the team, we would've been able to manage with our previous pipeline, but with MetaHuman we could spend that time and effort on the game itself instead.
CB: Are there any other innovations you're looking forward to in 2025, or making use of? Are there AI tools like Nvidia ACE you'd ever consider using?
AH: What I’m looking forward to the most is not necessarily technical innovations, but rather new ways of thinking about a game’s story, design, characters and environments to make the experience even more meaningful to the player. I love that there are always new frontiers to explore when it comes to the player’s experience.
CB: How do you see MetaHuman evolving in the future and what new possibilities do you see for small dev teams?
AH: I think that we've come a really long way over the last ten years in terms of democratising the means of production. You don't have to have a large team and a huge budget to make something truly great anymore, because there are so many tools readily available to the public. MetaHuman is a great example of this, and part of a trend that I think will continue: a future that is much more about coming up with interesting ideas than overcoming barriers to entry.
Ian Dean is Editor, Digital Arts & 3D at Creative Bloq, and the former editor of many leading magazines, including ImagineFX, 3D World and the video game titles Play and Official PlayStation Magazine. Ian launched Xbox magazine X360 and edited PlayStation World. For Creative Bloq, Ian combines his experiences to bring the latest news on digital art, VFX, video games and tech, and in his spare time he doodles in Procreate, ArtRage, and Rebelle while finding time to play Xbox and PS5.