The creator of the eerily convincing Tom Cruise TikTok deepfakes says the public shouldn’t worry about ‘one-click fakes’

When a series of hauntingly convincing Tom Cruise deepfakes went viral on TikTok, some commentators suggested the clips were a dire sign of things to come – a harbinger of an era in which AI would let anyone make fake videos of anyone else. However, the creator of the videos, Belgian VFX specialist Chris Ume, says this is far from the case. Speaking to The Verge about his viral clips, Ume stresses the amount of time and effort that went into making each deepfake, as well as the importance of working with a top-flight Tom Cruise impersonator, Miles Fisher.

“You can’t do it by just pressing a button,” says Ume. “That’s important, that’s a message I want to tell people.” Each clip took weeks of work, he says, using the open-source DeepFaceLab algorithm as well as established video editing tools. “By combining traditional CGI and VFX with deepfakes, it makes it better. I make sure you don’t see any of the glitches.”
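For readers curious what an algorithm like DeepFaceLab actually does, the core idea behind most open-source face-swapping tools is an autoencoder with a shared encoder and a separate decoder per identity. The sketch below is a minimal, hypothetical illustration of that concept in PyTorch – it is not Ume’s pipeline or DeepFaceLab’s actual code, and the layer sizes, input resolution, and training settings are assumptions made purely for the example.

# Conceptual sketch of autoencoder-based face swapping (hypothetical, not
# DeepFaceLab's real implementation). A shared encoder learns a common latent
# representation of faces; one decoder per identity reconstructs them.
# Swapping = encode a frame of person A, decode it with person B's decoder.
# Expects aligned 128x128 RGB face crops scaled to [0, 1].
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),    # 128 -> 64
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 64 -> 32
            nn.Conv2d(128, 256, 4, stride=2, padding=1), nn.ReLU(), # 32 -> 16
            nn.Flatten(),
            nn.Linear(256 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 256 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 256, 16, 16))

encoder = Encoder()
decoder_src = Decoder()   # trained on the impersonator's face crops
decoder_dst = Decoder()   # trained on footage of the target face

# Training alternates between the two identities with a reconstruction loss.
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_src.parameters())
    + list(decoder_dst.parameters()),
    lr=5e-5,
)
loss_fn = nn.L1Loss()

def train_step(batch_src, batch_dst):
    opt.zero_grad()
    loss = loss_fn(decoder_src(encoder(batch_src)), batch_src) \
         + loss_fn(decoder_dst(encoder(batch_dst)), batch_dst)
    loss.backward()
    opt.step()
    return loss.item()

# At inference time the "swap" runs the source face through the target decoder:
# fake_face = decoder_dst(encoder(aligned_source_face))

Because both identities share one encoder, their faces end up in a common latent space, which is what lets the target decoder carry the impersonator’s expressions over onto the target’s face.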

Ume has been working with deepfakes for years, and he also created the effects for the “Sassy Justice” series made by South Park creators Trey Parker and Matt Stone. He started working on Cruise after seeing a video of Fisher announcing a fictitious presidential run by the Hollywood star. The pair then worked together on a sequel and decided to post a series of “harmless” clips on TikTok. Their account, @deeptomcruise, quickly racked up tens of thousands of followers and likes before Ume pulled the videos a few days ago.

“It has achieved its purpose,” he says of the account. “We had fun. I created awareness. I showed my skills. We made people smile. And that’s it, the project is over.” A TikTok spokesperson told The Verge that the account was well within the platform’s rules for parody uses of deepfakes, and Ume notes that Cruise – the real Tom Cruise – has since made his own official account, perhaps prompted by the sight of his AI double.

Deepfake technology has been developing for years now, and there’s no doubt the results are becoming more realistic and easier to make. Although there has been much speculation about the potential damage such technology could do in politics, those harms have so far largely failed to materialize. Where the technology is doing real damage is in the creation of revenge porn, or non-consensual pornography of women. In those cases the fake videos or images don’t need to be realistic to cause tremendous harm: simply threatening someone with the release of fake images, or spreading rumors about the existence of such content, can be enough to destroy reputations and careers.

The Tom Cruise fakes, however, show a far more benign use of the technology: as another part of the CGI toolkit. Ume says there are many uses for deepfakes, from duplicating actors in film and TV, to restoring old footage, to animating CGI characters. What he stresses, though, is how incomplete the technology is when working on its own.

Creating the fakes took two months of training the base AI models (using NVIDIA RTX 8000 GPUs) on footage of Cruise, plus days of further processing for each clip. After that, Ume had to go through each video frame by frame, making small adjustments to sell the overall effect: smoothing a line here and covering up a glitch there. “The most difficult thing is making it look alive,” he says. “You can see it if it’s not right.”
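The frame-by-frame cleanup Ume describes is, in essence, ordinary compositing. As a rough illustration only – the file names, mask shape, and blending step below are assumptions for the example, not his actual tools or settings – here is how a clip might be read frame by frame, patched over a small feathered region, and written back out with OpenCV.

# Hypothetical illustration of frame-by-frame touch-ups (not Ume's workflow):
# read a clip, blend a corrected patch over a small region of each frame using
# a feathered mask, and write the result back out.
import cv2
import numpy as np

cap = cv2.VideoCapture("deepfake_raw.mp4")   # assumed input file name
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("deepfake_clean.mp4",
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

# Feathered elliptical mask over a problem area (e.g. a glitching mouth corner).
mask = np.zeros((h, w), dtype=np.float32)
cv2.ellipse(mask, (w // 2, int(h * 0.6)), (60, 40), 0, 0, 360, 1.0, -1)
mask = cv2.GaussianBlur(mask, (31, 31), 0)[..., None]  # soft edges, shape (h, w, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Stand-in "fix": a lightly smoothed copy of the frame. In practice the
    # patch would come from a manually corrected plate or a separate VFX render.
    patch = cv2.bilateralFilter(frame, 9, 75, 75)
    blended = frame.astype(np.float32) * (1 - mask) + patch.astype(np.float32) * mask
    out.write(blended.astype(np.uint8))

cap.release()
out.release()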

A lot of the credit goes to Fisher, says Ume, who captured Cruise’s exaggerated mannerisms, from his manic laugh to his intense delivery. “He’s a very talented actor,” says Ume. “I just do the visuals.” Even then, if you look closely, you can still see moments where the illusion fails, as in the clip below where Fisher’s eyes and mouth glitch for a moment as he puts on his sunglasses.


Blink and you’ll miss it: look closely and you can see Fisher’s mouth and eyes glitching.
GIF: The Verge

Although Ume’s point is that his deepfakes take a lot of work, plus the skills of a professional impersonator, it’s also clear that the technology will improve over time. It’s hard to predict exactly when making seamless fakes will become easy, but experts are already developing tools that can automatically identify fakes or verify unedited footage.

However, Ume says he isn’t too worried about the future. We’ve developed technology like this before, and society’s conception of truth has more or less survived. “It’s like Photoshop 20 years ago: people didn’t know what photo editing was, and now they know about these fakes,” he says. As deepfakes become more of a staple in TV and film, people’s expectations will adjust, as they did in the days of Photoshop. One thing is for sure, says Ume: the genie can’t be put back in the bottle. “Deepfakes are here to stay,” he says. “Everyone believes in it.”
