Growing up in Paris, I pretty much learned English by doing two simple things: binge-watching Netflix and listening to music. At the time, picking dubbed movies or shows over their original language came with a twinge of shame (make an effort, Alma, if you want to move to London) as well as a considerable amount of irritation throughout whatever I was watching. You know what I’m talking about here; no matter how good a translation is, the actors’ lips are usually out of sync and it just isn’t the same.
Enter London-based AI startup Flawless, which claims its deepfake technology could make dubbed movies (and shows, I assume) look far more natural. How? According to the company, its technology fixes out-of-sync dubbing by generating mouth movements that match the translated audio, then slapping them over the original footage.
Named ‘TrueSync’, the deepfake tech is billed as the world’s first system that uses AI to create perfectly lip-synced visualisations in multiple languages. “At the heart of the system is a performance preservation engine which captures all the nuance and emotions of the original material,” reads Flawless’ website.
The startup’s co-founder Nick Lynes tells The Verge that this process retains the movie’s original style and performance. Although the end result isn’t 100 per cent perfect, it’s pretty close. And Flawless says it can offer it quickly, cheaply, and in any language.
It’s also easier than a complete do-over, like Metástasis, the Colombian telenovela-style remake of Breaking Bad that doesn’t exactly replicate the performance that won Bryan Cranston four Emmys…
Now, I can already hear some of you saying you prefer the authenticity of subtitles (I’m looking at you, movie snobs). But look at it this way: while subtitles help those who are deaf or hard of hearing, dubbing helps those who are blind or have low vision. Still ready to stand by your pretentious argument?
In fact, most people prefer the ‘lazy’ way out. In 2018, the streaming giant Netflix found that people were more likely to watch a dubbed show than one with subtitles, which is why it has made the dubbed version the default. The company is now working with over 170 studios worldwide that offer dubs in more than 35 languages, according to Bloomberg. Case in point: Lupin, its number one show this quarter, is a French-language production dubbed for audiences worldwide.
Flawless’ deepfake tech could reshape the movie industry, in both alluring and troubling ways. It promises to let directors effectively reshoot movies in different languages, making foreign versions less jarring for audiences and more faithful to the original. But the power to alter an actor’s face so easily might also prove controversial if not handled carefully.
Soon enough, this AI dubbing technology will be invisible: people will watch something without ever realising it was originally shot in another language. While that sounds pretty exciting, even for movie snobs, it also highlights the growing risk deepfakes could pose in the near future.
After all, the same technology has been used to create fake celebrity porn (also known as deepnudes) and damaging revenge porn clips targeting women. Experts worry that deepfakes showing a famous person in a compromising situation might help spread misinformation and even sway an election. Is all of this really worth it in exchange for near-perfect dubbing?