It started with porn
Deepfake is a new and powerful weapon in the arsenal available to the merchants of lies
At the end of last year a series of pornographic videos began showing up on the internet. This is nothing new, but these were different because they starred some of the world’s top actresses and singers. Naturally, they went viral: millions of people around the world saw them. Very quickly it became clear that Scarlett Johansson, Taylor Swift, Katy Perry, and other artists were not the real protagonists of the sex videos, but rather the victims of a new technology that – using artificial intelligence and other advanced digital tools – allows their creators to insert anyone’s face into a very credible video.
And this was just the beginning. It wasn’t long before Angela Merkel, Donald Trump, and Mauricio Macri were also victims of what is known as the “deepfake.” Barack Obama was used, without his consent, to demonstrate the possible nefarious uses of the new technology: in a widely circulated video, Obama says whatever the forgers want him to say, words he never actually spoke, and yet the video looks entirely realistic.
Image manipulation is nothing new. Authoritarian governments have a long history of “disappearing” disgraced leaders from official photographs. And since 1990, Photoshop has allowed users to alter digital photographs, a practice that has become so common that Merriam-Webster recognizes “photoshop” as a verb.
But deepfake is different. And much more dangerous. In just the year since the fake celebrity porn videos appeared, the technology has improved dramatically. These videos are hyperrealistic: the person’s voice and gestures are rendered so precisely that it is impossible to identify a forgery without sophisticated verification tools. And perhaps the greatest danger of deepfake is that the technology is available to anyone.
A distraught ex could create (and anonymously distribute) a video that perfectly imitates the voice, gestures, and face of the woman who left him and in which she appears to be doing and saying the most shameful and degrading things. A video of the police brutally beating an elderly woman who is participating in a street march could provoke violent clashes between protesters and the police. The respected leader of a racial or religious group could incite his followers to attack members of another race or religion. Some students could produce a compromising video of a teacher they despise. Or digital extortionists could threaten a company with disclosing a damaging video, if the company does not pay a hefty ransom.
The possible uses of deepfake in politics, economics, or international relations are as varied as they are sinister. The release, shortly before an election, of a video showing a presidential candidate saying or doing reprehensible things will certainly become a common dirty trick. Even if the candidate’s opponent does not approve of the hoax, his most radical followers can produce and distribute the video without asking for anyone’s permission.
The counterfeit videos’ potential to cloud relations between countries and exacerbate international conflicts is also enormous.
And this is not hypothetical. It has already happened. Last year, the Emir of Qatar, Tamim bin Hamad al-Thani, appeared in a video praising and supporting Hamas, Hezbollah, the Muslim Brotherhood, and Iran. This provoked a furious reaction from Saudi Arabia, the United Arab Emirates, Bahrain, and Egypt, countries that already had strained ties with Qatar. They denounced the emir’s speech as supporting terrorism, broke diplomatic relations, closed their borders, and imposed a blockade by air, sea, and land. The reality, however, is that the Emir of Qatar never gave that speech. Although the video was not produced with deepfake technology, it was enough to dangerously escalate a conflict that was already simmering. The video was a fake, but the boycott it triggered is very real, and remains in force.
The threat that deepfake represents to social harmony, democracy, and international security is obvious. The antidotes to this threat are far less clear, although there are some proposals. All organizations that produce or distribute photographs or videos should be required to use technological safeguards that make their visual and audio material tamper-evident. People must also have access to technologies that protect them from becoming victims of deepfakes. Laws must be updated so that those who defame or harm others through these technologies can be brought to justice. The ease with which it is now possible to operate anonymously on the web should not be tolerated. All of this is necessary, but insufficient. We will need to do much more.
We have entered an era in which the ability to differentiate the truth from lies, facts from fiction, is being eroded. And with it, trust in institutions and in democracy. Deepfake is another new and powerful weapon in the arsenal that the merchants of lies have at their disposal.
We have to fight them.
Twitter @moisesnaim