
Today's technology, applied to film, television and advertising, can achieve incredible things. These advances, together with the latest trends in the audiovisual sector, have made it fashionable again to talk about entirely computer-generated elements, face replacement, or even de-aging actors so that they look 30 years younger. Today we want to talk to you about, and clear up some doubts around, the two most popular techniques in this field: CGI and deepfakes.
CGI vs deepfake: what they are and how they differ

Although talking about this topic has become very popular in recent years, these techniques (especially CGI) have been used in film, TV and advertising for a long time. Of course, the public's unfamiliarity with them often leads to confusion, or to the idea that they are just "things done by computer." The truth is that they are completely different techniques, although they can complement each other.
On the one hand we have CGI, or "Computer-Generated Imagery", that is, images generated by computer. These graphics, whether in 2D or 3D, are widely used in art, movies, television programs, advertisements and video games.
In film and television, CGI is often used to recreate scenes that, in many cases, would be far more expensive to shoot in real life than to generate with advanced computing techniques. Taken to the extreme, there are scenes that would simply be impossible to obtain without CGI, such as having a deceased actor or actress appear in a movie or series. An example of CGI is the computer-generated imagery of the popular series The Mandalorian.
On the other hand, the term "deepfake" is a blend of the words fake and deep learning. It is a technique based on artificial intelligence, used in film and television mainly to replace faces, or to rejuvenate or age them in an ultra-realistic way. It is, once again, something made by computer, but this technique requires two real sources to work.
To explain it simply: using AI and facial recognition, every feature of a subject "A", whose face we want to replace with someone else's, is analyzed. Then, with our subject "B", whose face has been chosen as the replacement, we need to build a "database" of every feature of their face, either from a video or from many photographs taken at different angles. Once all the morphological data has been extracted from subject B's face, subject A's facial features are also mapped so that their face can be "fixed" at every possible angle. With the features of both subjects known, the two faces are aligned by computer, producing something similar to a mask laid over subject A's face. To finish the job (although the whole process is far more involved), compositing and VFX are used to "remove the excess".
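To give a rough idea of what those steps look like in practice, here is a minimal sketch in Python. It assumes the third-party face_recognition and OpenCV libraries and hypothetical image files subject_a.jpg and subject_b.jpg; it only illustrates the landmark analysis, alignment, masking and compositing stages with a simple geometric warp, whereas a real deepfake replaces the warp with a trained neural network.

```python
import cv2
import numpy as np
import face_recognition  # dlib-based landmark detection

def landmark_points(image):
    """Return the 2D facial landmarks of the first detected face as an (N, 2) float array."""
    faces = face_recognition.face_landmarks(image)
    if not faces:
        raise ValueError("no face detected")
    return np.array([p for feature in faces[0].values() for p in feature], dtype=np.float32)

# Subject A: the face to be replaced. Subject B: the face we paste on top.
target = face_recognition.load_image_file("subject_a.jpg")   # hypothetical file names
source = face_recognition.load_image_file("subject_b.jpg")
pts_target = landmark_points(target)
pts_source = landmark_points(source)

# 1. Alignment: estimate a similarity transform mapping B's landmarks onto A's.
matrix, _ = cv2.estimateAffinePartial2D(pts_source, pts_target)
warped = cv2.warpAffine(source, matrix, (target.shape[1], target.shape[0]))

# 2. "Mask": the convex hull of A's landmarks delimits the region to replace.
hull = cv2.convexHull(pts_target.astype(np.int32))
mask = np.zeros(target.shape[:2], dtype=np.uint8)
cv2.fillConvexPoly(mask, hull, 255)

# 3. Compositing: blend B's warped face into A's image (the "remove the excess" step).
x, y, w, h = cv2.boundingRect(hull)
center = (x + w // 2, y + h // 2)
result = cv2.seamlessClone(warped, target, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("swap_result.jpg", cv2.cvtColor(result, cv2.COLOR_RGB2BGR))
```

Even this toy version follows the logic described above: analyze both faces, align B onto A, and composite the result so only the face region is replaced.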
The final result, if it has been done correctly, is that subject A's face ends up looking remarkably like B's. An example of a deepfake is the one in the following image, in which Jack Nicholson's face in "The Shining" has been replaced with that of the actor Jim Carrey.

The most popular CGI and deepfakes
Now that you know a little more about these popular techniques used in film, television and many other media, we want to show you some of the most talked-about CGI and deepfakes of the moment. Some have been heavily criticized, and below we explain why.
The criticized CGI of The Mandalorian

As we mentioned a few lines above, using CGI in cinema to bring characters to the screen is the order of the day. That is exactly what was done in the final episode of season 2 of The Mandalorian, with the reappearance of a very young Luke Skywalker.
There is no official statement from Disney explaining how these scenes were made, but it is evident that they were not as polished as they should have been (especially considering the budgets involved).
After the fans' euphoria at seeing one of their favorite characters "brought back to life" with the appearance he had in the 70s, criticism of how unrealistic these scenes looked was not long in coming.
Seizing the moment, the creators of Sam & Niko, a YouTube channel specializing in video editing and special effects, decided to try to improve these scenes, and they succeeded.
As they explain in one of their videos, they thoroughly analyzed the movements and facial features in the clips in which Luke appears (along with the lighting and texture errors they point out in them). Then, digging through the archives, they gathered and analyzed a large number of videos and photographs of the character from when he had the desired appearance. After all this analysis, and with the help of an ultra-powerful rig to process everything, they used the deepfake technique to place a Luke Skywalker mask over the scene created by Disney in The Mandalorian.
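As a rough illustration of that archive-gathering step, the sketch below (Python, using only OpenCV's bundled Haar cascade face detector, with a hypothetical archive_footage.mp4 as input) extracts face crops from footage so a deepfake model has many angles of the target face to learn from; the actual tools Sam & Niko used are not documented here.

```python
import os
import cv2

# OpenCV ships a pre-trained Haar cascade for frontal faces.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

os.makedirs("face_dataset", exist_ok=True)
cap = cv2.VideoCapture("archive_footage.mp4")  # hypothetical source clip
saved = 0

while True:
    ok, frame = cap.read()
    if not ok:
        break  # end of the clip
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Every detected face becomes one training image for the face model.
    for (x, y, w, h) in detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        crop = cv2.resize(frame[y:y + h, x:x + w], (256, 256))
        cv2.imwrite(f"face_dataset/face_{saved:05d}.jpg", crop)
        saved += 1

cap.release()
print(f"{saved} face crops saved")
```

The more angles, expressions and lighting conditions the crops cover, the better the final mask holds up, which is why so much archive material was needed.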

Was their result better? You can judge for yourself by watching their video, but in our opinion the improvement is remarkable.
Will Smith's CGI in Gemini Man

Another of the most popular applications of these techniques in recent years was in the movie Gemini Man, in which we see a 51-year-old Will Smith fighting another, 23-year-old "Will".
Once again, computer-generated graphics, combined with facial recognition and AI techniques, made it possible to analyze every facial feature of the actor in 2019 and then merge them with images of how he looked when he was barely 20. Luckily, there is a large archive of Will Smith from that era, in films such as Bad Boys or Men in Black.
Will himself explains part of the process (showing many behind-the-scenes images) in a video on his YouTube channel.
The Lola Flores deepfake
The latest advertisement from the brewing company Cruzcampo could not be left out, as a clear example in which, using the deepfake technique, the popular singer Lola Flores was "resurrected".
If you want to know in detail how this technique was applied to produce the spot, the company explained the process in a video on its own YouTube channel.
The mobile era: apps to make deepfakes with your phone
As we mentioned at the beginning of this article, technology advances by leaps and bounds, and things that used to require extremely powerful equipment can now be done with our mobile phones.
The deepfake technique is another example of this. There are several apps that let us take a selfie and, in just a few seconds, place our face on world-famous characters such as Harry Potter, Shakira or Captain Jack Sparrow, among many others.
As you can imagine, these are pre-built models that, with far less computation than a film or TV production, let us place our face on them in a simple but quite effective way. They are not going to fool anyone, but for posting on social media or having a laugh they are more than enough.

An example of this is the Reface app. As we said, it is as simple as following the steps the application gives you: after taking a selfie and having it analyzed, we can swap our face onto well-known characters of all kinds. Of course, be warned that if you want to get the most out of this app, you will have to pay.

Another example is the iface app, available only for Apple phones. It works just like the previous one:
- We start the app.
- It asks us to take a selfie so it can analyze it.
- We choose one of the free models in its catalog and, in just a few seconds, our face appears on that of an actor, singer or well-known public figure.
But, once again, we will have to pay up to take advantage of all the models in this app.