THE SCENARIO

If Someone on TV Looks Perfect, They Probably Had “Digital Work” Done

We normies may have just discovered the unsettling power of the Bold Glamour filter, but Hollywood's been using similar tech for years.
[Photo illustration: a woman on a TV screen with her face pixelated. Getty Images/Bella Geraci]

In The Scenario, reporter Kirbie Johnson takes readers behind the scenes of the buzziest movies and TV shows to reveal how the best wigs, special effects makeup, and more are created. For this edition, Johnson dives deep into the prevalence of digital face filters in Hollywood. 

When I hear the term “digital makeup,” Clueless-inspired application technology comes to mind. Like Cher swiping through clothing options on her computer and retrieving them from her mechanical closet, I select my ideal look by scrolling through shades of blush, picking between luminous or matte skin, bumping up the thickness of my lashes. Each option presents itself on a digital scan of my mug. With a press of a button, I sit in a director’s chair while some magical makeup machine, akin to those robotic arms Beyoncé utilizes during her performance of "Heated," administers all of these elements to my face with precision and ease in under 10 minutes.

We aren’t there yet. Instead, “digital makeup” most recognizably exists as social media filters, which won’t help in real life but can let you skip any product application before filming on your phone. There are apps that can create everything I said above — fuller lashes, carved cheekbones, less shine — plus incorporate cosmetic changes, from a smaller nose to a smaller head. (I’ve joked that celebrities who look like their heads have been “pinched smaller” are likely using these apps to get baby-like proportions.) Filters have gone from just okay — obviously fake lashes that glitch with each movement, blindingly white eyeballs, skin smoothed and contoured to a superficial, mannequin-like texture — to virtually undetectable. It might as well be how you look in real life. But it’s not.

TikTok filters like Bold Glamour and Lite Foundation had people in a tizzy mostly because, unlike the filters we’re used to, they use nearly undetectable technology to change you into the hottest version of yourself. (At least in theory: Bold Glamour actually makes me look like Handsome Squidward. Beauty standards are subjective, y’all!) TikTok isn’t the only place we’re seeing these realistic digital tweakments: Zoom offers a “pretty” filter to clean up the skin (although it’s fairly obviously a filter), and even our iPhones beautify our likeness.

According to The Verge, these eerily imperceptible filters likely utilize a form of artificial intelligence called a GAN (Generative Adversarial Network), a type of neural network. (If you're staring at this blinking rapidly, stay with me.) Of all places, Amazon explains this quite nicely: Neural networks are a “method in artificial intelligence that teaches computers to process data in a way that is inspired by the human brain.”
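If that definition still feels abstract, here’s the idea at toy scale, in Python. To be clear, everything below is illustrative (the sizes are arbitrary and the weights are random rather than learned; no real filter is this simple): a “neuron” takes a weighted sum of its inputs and pushes it through a nonlinearity, and stacking layers of neurons gives a network that can learn patterns, like what a face looks like.

```python
import numpy as np

# A "neuron" is a weighted sum of inputs pushed through a nonlinearity;
# a network is layers of them. Everything here is a toy sketch: sizes
# are arbitrary and weights are random rather than learned.
def layer(x, weights, bias):
    return np.maximum(0, weights @ x + bias)  # ReLU: keep positives, zero the rest

rng = np.random.default_rng(0)
pixels = rng.random(64)  # stand-in for a tiny 8x8 face crop, flattened

hidden = layer(pixels, rng.normal(size=(16, 64)), np.zeros(16))  # 64 -> 16
output = layer(hidden, rng.normal(size=(4, 16)), np.zeros(4))    # 16 -> 4
print(output)  # four numbers a trained network might use as scores
```

Training is the process of nudging those random weights until the outputs become useful; the architecture itself stays this simple in spirit.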

Matt Panousis, chief operating officer and partner at Toronto-based visual effects company and AI startup Monsters Aliens Robots Zombies (MARZ), explains that the filters we are used to seeing may blur or brighten but are often obvious in your final shot. “These filters then use standard face tracking to follow the user’s movement,” says Panousis. “When you look closer, you will see that you are losing a lot of detail like pore texture. You will also see that face tracking will fail when the movement is too fast or extreme, or when lighting changes, and when the face is partially obscured, which we call occlusion.”
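To picture what that kind of conventional filter is doing, here’s a rough sketch in Python using Google’s MediaPipe face-landmark library. This is a simplification, not TikTok’s actual code, and the file names are placeholders:

```python
import cv2
import mediapipe as mp

# Conventional filter, sketched: find facial landmarks, then anchor
# graphics (lashes, blush, a mask) to those tracked points.
# "frame.jpg" stands in for a single video frame.
face_mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=True)

frame = cv2.imread("frame.jpg")
results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    h, w = frame.shape[:2]
    for point in results.multi_face_landmarks[0].landmark:
        # A real filter would paste makeup at these coordinates;
        # here we just mark each tracked point with a green dot.
        cv2.circle(frame, (int(point.x * w), int(point.y * h)), 1, (0, 255, 0), -1)
else:
    # Fast movement, a lighting change, or a hand over the face
    # (occlusion) means no landmarks, and the "makeup" vanishes.
    pass

cv2.imwrite("overlay.jpg", frame)
```

Notice the failure mode Panousis describes lives in that `else` branch: when tracking drops out, so does the effect.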

A GAN, on the other hand, pits two neural networks against each other: one generates an image while the other judges whether it looks fake, and they train until the output is convincing. Applied to a filter like Bold Glamour, that means your face and the filter’s idealized features are essentially synthesized into a single new image, versus a mask simply being overlaid onto the face.
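Here’s that “adversarial” setup at toy scale, in Python with PyTorch. The layer sizes are illustrative and this is not Bold Glamour’s actual model; the point is the two competing networks:

```python
import torch
import torch.nn as nn

# A generator makes images from random noise; a discriminator learns to
# call them fake. Training them against each other pushes the generator
# toward realism. All sizes here are illustrative, not from any real filter.
generator = nn.Sequential(nn.Linear(100, 256), nn.ReLU(), nn.Linear(256, 64 * 64))
discriminator = nn.Sequential(nn.Linear(64 * 64, 256), nn.ReLU(), nn.Linear(256, 1))

loss_fn = nn.BCEWithLogitsLoss()
noise = torch.randn(8, 100)     # random seeds for 8 fake "images"
fake_images = generator(noise)

# The discriminator wants to score fakes as 0; the generator wants to
# fool it into scoring them as 1. Each side improves by minimizing its loss.
d_loss = loss_fn(discriminator(fake_images.detach()), torch.zeros(8, 1))
g_loss = loss_fn(discriminator(fake_images), torch.ones(8, 1))
print(d_loss.item(), g_loss.item())
```

Because the output is generated rather than pasted on, there’s no mask to slip when you move your head, which is why these filters are so hard to catch.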

The discussion around GAN filters has ranged from praise for the technological advances to deep concern about what those advances mean for the real world. Comparing ourselves to photoshopped images in magazines and advertisements was tiresome enough, but at least deep down we knew the people in those photos were celebrities or models whose entire careers revolved around being better-than-average looking. Now we’re also trying to live up to photos and videos on social media, which gives normies easy access to the types of editing tools previously reserved for a specific subset of humanity. Could this proliferate to the point where we each have a “digitally optimized” look that varies greatly from how we look in the real world? (Perhaps it could be argued that this is already rampant. Just the other day someone told me, “I’m so glad to meet you. You look just like your photos!” It was a bizarre comment that I couldn’t help but find flattering in this current landscape.)

While everyone is up in arms about the digital filters their favorite influencers are using, what’s less discussed is how widely similar technology has been used in film and television. Of course, Hollywood standards differ from social media; most people watching movies are aware they aren’t witnessing real-life events. But like its social media filter counterparts, this type of technology — especially when it isn’t widely disclosed — can perpetuate unattainable standards.

In 2021, I spoke with visual effects artist Rod Maxwell for an Allure story on how special makeup effects work with digital effects like CGI. Maxwell relayed that in his line of work, he engages in a digital dermatology session during post-production. I reached back out to him this year for further explanation.

“For our modern day audiences, films and television are akin to covers of magazines where women and men go, ‘That's not fair. I can't look like that.’ And then you talk to the real actors or actresses or models and they go, ‘I don't even look like that!’” says Maxwell.

Some edits are made to keep the audience from being distracted. For instance, if an unplanned pimple or cold sore pops up on the day of shooting, it'll probably be edited out in post-production. “The audience is going to think, ‘That means something, that's a clue!’ A blemish could be part of the character design,” says Maxwell, noting that if it isn’t part of the design, leaving it in could mislead the audience. Of course, there are more extreme digital alterations that can take place, such as de-aging for a role, but typically Maxwell is making tiny tweaks to help actors look more well-rested or “better-moisturized.”

To make someone look less tired, Maxwell says it’s about digitally editing the lighting. Harsh lighting with severe contrast can accentuate dark circles, lines, and pigmentation on the skin. Even if that’s the type of lighting that was on set, Maxwell can create nice, even lighting in post-production, making sure it matches the actor's environment in the scene so it doesn’t look distracting or out of place.
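As a ballpark illustration of the idea (my own sketch in Python with OpenCV, not Maxwell’s actual toolchain), evening out harsh lighting can be as simple as lifting the shadows on an image’s brightness channel; the file names and numbers are placeholders:

```python
import cv2
import numpy as np

# Rough idea only: lift the shadows on the lightness channel so harsh
# contrast stops accentuating dark circles, while leaving color untouched.
# "harsh_lighting.jpg" and the gamma value are placeholders.
frame = cv2.imread("harsh_lighting.jpg")
lab = cv2.cvtColor(frame, cv2.COLOR_BGR2LAB)
L, a, b = cv2.split(lab)

# A gamma below 1 brightens dark regions more than highlights,
# flattening the contrast that deepens under-eye shadows.
gamma = 0.8
L_lifted = (np.power(L / 255.0, gamma) * 255).astype(np.uint8)

evened = cv2.cvtColor(cv2.merge([L_lifted, a, b]), cv2.COLOR_LAB2BGR)
cv2.imwrite("evened_lighting.jpg", evened)
```

Real VFX work is far more surgical and has to stay consistent shot to shot, but the principle is the same: change the apparent lighting, not the face.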

Like any special effect, whether a production is able to utilize this type of digital touch-up typically comes down to time and budget. However, Vanity AI, an AI-powered visual effects tool launched by MARZ, is meant to help mitigate both of those problems in television, offering a cost-effective solution that provides VFX quickly, delivering (according to the brand, at least) film-quality results on a smaller budget, in minutes rather than days. Panousis shares that Vanity AI is able to cut down on costs and turnaround times by handling large volumes of high-end 2D aging, de-aging, cosmetic, wig, and prosthetic fixes. By the company’s claims, the tech, which has been used in series like Stranger Things 4 and films like Spider-Man: No Way Home and Being the Ricardos, is up to 300 times faster than traditional VFX pipelines, significantly more cost-effective, and has no capacity constraints — meaning it can accelerate production timelines no matter when it’s utilized in the process.

Panousis acknowledges that AI isn’t a perfect tool, but MARZ aims to empower artists with “next generation AI technology that allows them to do work in a fraction of the time for a fraction of the cost.” When the idea for Vanity AI first started taking shape in 2019, the company decided to focus on a visual effects problem encountered on most projects: aging, de-aging, and cosmetic work.

“In practice, I can take people pretty convincingly and very naturally about 15 years on either side of where they are today and can do any level of cosmetic work on them,” says Panousis. “The software can accommodate any kind of textural change — if you want eyebags removed, or crow's feet removed, or forehead lines removed, or laugh lines removed, or marionette lines removed.” He can also add those features, if the actor needs to be aged up.

The software works to make these changes without losing other details in the process. For example, “I want to get rid of a line, but I don't want to lose pore detail, I don't want to look like I've been airbrushed,” says Panousis. “The word ‘natural’ was really important for us because at the end of the day, celebrities are people too. They have imperfections. They don't want to look like they've been touched up. [Vanity AI] gives the [visual effects] artist a lot of control and it's highly automated, but the output itself is very, very natural.”
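One generic retouching technique that shows why “natural” is achievable (an assumption on my part to illustrate the concept, not a description of Vanity AI’s actual pipeline) is frequency separation: split the image into broad tones and fine texture, smooth only the former, and the pores survive.

```python
import cv2
import numpy as np

# Frequency separation: low frequencies hold broad tones (lines, eyebags,
# shadows); high frequencies hold pore-level texture. Smooth only the
# former, add the texture back, and the result avoids the airbrushed look.
# "face.jpg" is a placeholder; this is a generic retouching technique,
# not Vanity AI's proprietary method.
face = cv2.imread("face.jpg").astype("float32")

low = cv2.GaussianBlur(face, (0, 0), sigmaX=8)   # broad tones and shading
high = face - low                                # fine texture: pores, hairs

# Soften only the layer where lines and shadows live...
low_smoothed = cv2.bilateralFilter(low, d=9, sigmaColor=40, sigmaSpace=40)

# ...then restore the untouched texture on top.
retouched = np.clip(low_smoothed + high, 0, 255).astype("uint8")
cv2.imwrite("retouched.jpg", retouched)
```

The design choice matters: because the high-frequency layer is never filtered, the skin keeps its grain, which is exactly the “don’t lose pore detail” constraint Panousis describes.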

Panousis can’t speak about the shows that have used Vanity AI for cosmetic fixes like line or blemish removal and texture smoothing; Vanity AI is hired as a visual effects vendor, and certain productions may have protections in place, like NDAs with actors who might prefer that their audience not be clued in to all these digital touch-ups. De-aging, however, is a less sensitive topic: Vanity AI was utilized to de-age a returning villain in Spider-Man: No Way Home as well as for flashback sequences in shows like Dr. Death and The Walking Dead.

Based on the years of research MARZ did before launching the product, Panousis estimates that about 80% of projects will leverage some form of this type of digital cosmetic work. “That’s not to say they'll do hardcore aging or de-aging on 80% of show[s], but they're doing some level of cosmetic digital makeup on those projects.” Vanity AI exists because there was white space for it within the industry, although Panousis says he’s not confident in claiming how prevalent digital makeup currently is, because nobody wants to discuss these post-production tweaks on the record. (I can testify to this — getting visual effects artists to speak about their work for this story felt akin to asking a physician to violate their patients’ HIPAA rights.)

But rest assured: if someone looks impossibly perfect on TV, there’s a good chance they’ve gotten a little digital work done.


Read more from The Scenario:

How They Made Harrison Ford Look 40 Years Younger in Dial of Destiny

Meet the Team Making All of Hollywood's Most Realistic Prosthetic Penises

All the Subtle Barbie Beauty References You Might Have Missed


