A 2023 Mrs. Doubtfire might have had Harvey Fierstein analyzing Robin Williams’ face in his makeup chair, going, “Oh honey, let me get my iPhone – we’re gonna need Metahuman Animator for this.”

That frankly sounds terrible – much too easy. Mrs. Doubtfire would have been outfitted with a 3D-printed hyperrealistic mask based on a perfect scan in no time. I like my face masks to be toilsome and filled with buckets of plaster. (On a side note, do you realize that there is an entire generation of movie watchers growing up without Robin Williams?!)

Epic Games released the Metahuman Animator pipeline for facial performance capture in the second week of June, and the reactions have been positive, if not outright captivated. Epic first unveiled the tool at the 2023 Game Developers Conference, and it’s now available as a free plug-in download on the Unreal site. So, in addition to letting creators build humans from a custom mesh with Mesh to Metahuman, Unreal now lets creators capture an actor’s performance and translate it seamlessly into facial animation.

For authentic, non-PR-glossed reactions to Metahuman Animator, we go down to the grey cobbled streets and dark alleys of the internet: the YouTube comment section.

Metahuman Animator was first showcased with the short film Blue Dot by 3Lateral

3Lateral is Epic Games’ in-house consulting and pipeline development team for facial animation. They teamed up with Serbian actor Radivoj Bukvić and director and cinematographer Ivan Šijak to create a film that encases Metahuman Animator in four minutes of atmosphere and poetry. The performance capture is impressive – Bukvić’s chin shading is so defined that you know it’s about 12 hours of stubble growth, and his microexpressions accurately convey a stoic heaviness with a trace of regret. The overall viewer consensus was along the lines of, “Epic has killed the uncanny valley.” User andreasschultze6644 said:

I’m a professional 3D Artist and this is so extremely impressive and frightening at the same time. You struggle really hard to keep up with the speed of development. It’s just crazy how good that looks in every way. Lighting, Animation, Skin slide, Hair, Sound… just everything is on another level.

The “backlash” to Metahuman Animator seems largely manufactured. The YouTube channel Enfant Terrible, with its 131k subscribers, posted the Blue Dot film with the title “New MetaHuman Animator is TOO REALISTIC in UNREAL ENGINE 5.2.” That’s pure clickbait – scroll through the comments and even Enfant Terrible’s description and you’ll be hard-pressed to find negativity. The most common criticism is this: the articulating mouth has the slightest degree of stiffness to it.

Creators are starting to post their experiments for the world

It’s honestly surprising that Metahuman Animator hasn’t become a TikTok trend – you don’t need to be a professional game developer to master its functions, and the scanning is all done with an iPhone. It’s fun to look at how users are playing with the tool, but it also provides a more dynamic view of the expressiveness of the translated mannerisms. Here’s an original video posted by JSFILMZ:

The immediate instinct is to sit here and try to find flaws in the animation. It’s tough, right? Maybe, maybe some rigidity in the cheeks. Slight rigidity.

Creators will be Mrs. Doubtfirin’ all summer long

“Doubtfirin’” totally works as a verb. This is the digital equivalent of putting an actor in a makeup chair and telling them you’re going to build them a face that’s more realistic than they are.

For more information about using Metahuman Animator, you can read the Unreal Engine press release, or log in to your Unreal account and do what any growth-oriented creator does:

Start learning.