On Actors and Deepfakes
How professional actors can win using technology, and what to guard against.
In March, a deepfake video carrying a message of surrender was uploaded to a hacked Ukrainian website. It was the first serious attempt to use a deepfake for terrorism and political gain, and there will be many more to come.
Citizens were rightfully terrified and urged their governments to restrict AI-generated “synthetic media”, even though bans may be neither feasible nor effective [1]. In fact, the best defense against higher-quality attempts will be actors taking control of these powerful AI tools while defending their own likenesses.
Is this new? Should we be alarmed? Or will society grow accustomed to fakes, just as it did to photoshopped images and fake news? Do we ban the printing press or the Adobe suite? In a few years, people will be able to make deepfakes quickly and easily on their phones. The technology cannot be completely controlled, but it can be observed and put to good use. The term already carries negative connotations from misuse, and the practice has lately been rebranded as “digital humans” and “virtual acting”.
Costumes, make-up, and wigs have been part of theater since antiquity. The ancient Greeks didn’t allow women to perform, so men dressed up for the roles. Centuries later, filmmakers used “special” visual effects (VFX) to create the impossible [2]. With the advent of digital film, Computer-Generated Imagery (CGI) expanded the realm of possibilities, most prominently the dinosaurs of Jurassic Park, which still hold up after 30 years.
If audiences don’t need to strain their imagination, it’s easier to get lost in the story.
Deepfakes take CGI and imitation to the next level, powered by AI and “Deep Learning” (hence the portmanteau “deepfake”). Previously, only rare and talented creators at shops like ILM could produce CGI and “digitally-portrayed” acting. On powerful computers, these creators painstakingly modeled each frame, combining artistry and engineering talent. Directors and producers worked closely with them to find the right shots, a collaboration pioneered by technologists like Steve Jobs, Ed Catmull, and John Lasseter at Pixar.
AI now leverages video data to recreate people in scenes they never performed. Soon almost anyone will be able to produce hyper-realistic, camera-quality video of people saying and doing anything; Big Tech and authorities are already monitoring the Internet and removing some of it. Longer content, however, tends to suffer from the “uncanny valley”: our brains recognize and reject near-perfect fakes. Many films have run into this limitation; see The Polar Express and Beowulf.
Can actors and filmmakers benefit from this technology? Can they use their own likeness and data to help detect and flag fake videos? We think so, and hope AI can help bring positive impacts to this powerful new tool.
Deepfakes can be made "one-shot" from a single image, even a painting
Recently on the set of Rust with star Alec Baldwin, an accident involving a prop gun caused the death of cinematographer Halyna Hutchins and injured director Joel Souza. Production was suspended [3]. Many were reminded of the untimely death of Brandon Lee on the set of The Crow.
Productions sometimes require action sequences and dangerous stunts that are critical to the plot. Unfortunately, the history of cinema is haunted by incidents in which brave stunt actors lost their lives [4]. Deepfakes can help synthesize these scenes, reducing the need to perform risky maneuvers. Just like the famous example of an actor imitating Tom Cruise, one person’s face can be placed on another’s body [5] for up-close stunt re-enactments, and soon it won’t even require the body double. Actors can use their likeness to enact stunts without putting themselves in harm’s way.
They can record their lines as usual, then tweak and apply digital changes as needed. Anyone can play any character, which is powerful for diversity and the democratization of the industry. It was not always possible for everyone to play anything, despite the desire and talent to do so. Theater recently broke casting barriers with the 2015 hit musical Hamilton. Soon gatekeepers will have no reason not to cast diverse portrayals, with the ability to blend likenesses in any way that improves believability. Many actors may prefer not to use AI for years to come, and that’s fine too.
Deepfakes can allow actors to play a variety of roles and scenes while digitally faking the ones they are not comfortable with, including both stunts and sex scenes [6]. The movies will be better for it, showing the real actors’ likenesses and mannerisms while still matching the artistic vision of the picture. Some actors have refused nude scenes, preferring to have the script edited or a body double used, or have turned down the role entirely. That benefits no one, and deepfakes can help.
Unfortunately, this technology also makes it possible to generate unsanctioned, illicit synthetic pornography, especially of famous actors, whose films put hundreds of hours of digital video data within easy public reach. This is something for everyone to watch out for, and it is already on the radar of authorities. Governments are enacting laws prohibiting the non-consensual sharing of intimate images.
As with political deepfakes, however, the best way to detect and remove this content from the public Internet will be a form of AI automation that flags it across the Web. Unlike other dangerous and powerful technologies such as nuclear weapons, the nature and portability of software mean no government can fully monitor it or restrict it from bad actors, and it’s inevitable that some will continue its use unrestricted.
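To make the flagging idea concrete, here is a minimal toy sketch of one building block such automation might use: fingerprinting frames with a perceptual “average hash” and matching them against a registry of known fake clips. This is an illustrative assumption, not how any real platform works; production systems would combine fingerprinting with learned detectors, and all names and thresholds below are hypothetical.

```python
# Toy perceptual-hash flagging sketch (illustrative only).
# A frame is an 8x8 grayscale grid flattened to 64 ints in 0..255.

def average_hash(pixels):
    """Hash a flattened grayscale frame to a 64-bit fingerprint:
    each bit records whether that pixel is at or above the mean."""
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_known_fake(frame_hash, registry, max_distance=5):
    """Flag a frame whose fingerprint is near any registered fake.
    The distance threshold tolerates re-encoding and mild edits."""
    return any(hamming(frame_hash, h) <= max_distance for h in registry)

# Example: a slightly re-encoded copy still matches the registry entry.
original = [10] * 32 + [200] * 32
recompressed = [12] * 32 + [198] * 32
registry = {average_hash(original)}
print(is_known_fake(average_hash(recompressed), registry))  # True
```

The point of the sketch is that fingerprint matching is cheap and robust to small pixel changes, which is why it suits re-upload detection, while genuinely new fakes would still require the learned detectors mentioned above.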
Going Back In Time and Around The World
Life happens. Filming begins, and near the end, for whatever reason, an actor passes away and can’t continue their legacy. Do you think their last wish would be to scrap the feature, or would they want production to finish so the world could see their final role? Paul Walker died in a car crash before Furious 7 was released, so his brother honored his acting legacy as a body double while the final unshot scenes were deepfaked. Their physical resemblance made this feasible even before modern AI, an option that isn’t always available. Carrie Fisher passed away during production of the recent Star Wars films; CGI and deepfakes helped finish her final work. Similarly, aging has limited the roles available to actors. Robert De Niro, Will Smith, and Arnold Schwarzenegger have all used de-aging to synthesize shots for The Irishman, Gemini Man, and Terminator 4. This approach is especially good at beating the uncanny valley, because the actors themselves serve as “body double” and deliver the real lines. Movie plots from Tron to Gemini Man have been made possible this way, featuring both the aged actor and a separate, de-aged younger character in the same scenes.
The hit show Euphoria received some criticism recently for its portrayal of teenagers and drug use (see the Sydney Sweeney cover photo in the references below). Actors are sometimes asked to enact fictional scenes of rape and violence, which can be especially challenging for minors filming in person.
In some cases we want, and need, to tell graphic stories on film, so this tool should be available to actors who prefer to use it. Tools must benefit the user, and we think AI can be a great tool for actors to make great films. Not everyone in Hollywood gets job offers; new or struggling actors fighting to break into the industry may feel forced to accept roles with scenes they are not comfortable with. Again, virtual acting with CGI can help them land those roles and build a body of great artistic film work for the world.
Finally, we also have the option to show our films in other languages for audiences worldwide. Not everyone will want to change the original-language audio; many will continue using subtitles, and that’s fine if it’s best for the experience. Product commercials can already be effectively localized, with an authentic actor representation [7], into any language in any country with no re-shoots (even with different brand names “said” by the original actor). This is another great benefit and will become a powerful revenue generator for studios and media companies.
Filmmakers and actors can leverage AI translation to reach more audiences, even changing lip movements to match.
Not everyone can have a perfect deepfake. There are imperfect-quality AR apps that paint your face onto others (e.g. Snap’s virtual masks, and filters on TikTok or Zoom). The reason deepfakes of Tom Cruise are possible today (without consent or legal recourse) is the many hours of video data readily available to the public.
So are you safe from deepfakes? Mostly yes, for now, but public figures will be subject to both the positives and negatives described here. We hope to help aspiring actors and established stars alike profit from this game-changing new technology.
Be sure to read Part 2 in this series, On Actors and Deepfakes:
References
Cover Photo - Sydney Sweeney Was “Grossed Out” Filming “Euphoria” Season 2 Hot Tub Scene - Sara Delgado (Teen Vogue - Photograph by Eddy Chen / HBO all rights reserved)
Footnotes and citations:
[1] Chapter 5, “Security and World Order”, The Age of AI And Our Human Future (Kissinger, Schmidt, Huttenlocher)
[2] Special Effects (SFX), Visual Effects (VFX), CGI, and more
[3] “Rust” shooting incident involving Alec Baldwin and a prop gun
[4] 20 Movies and TV Shows Where Stunt Actors Died During Filming (Newsweek)
[5] More on the Tom Cruise deepfake (Chris Umé, Metaphysic.ai)
[6] More examples of actors scarred filming graphic scenes:
Actors refusing nudity and sex scenes for various personal reasons:
[7] Marketing for JustEat featuring Snoop Dogg, using deepfake video and audio translation technology to cross markets and languages:
COPYRIGHT © 2021-22 CYBERFILM.AI CORPORATION