The spread of realistic-looking doctored videos known as deepfakes is nothing new.
In fact, we’ve reported on it in depth as part of the Future of Everything series. But a pair of new videos doing the rounds on the internet yesterday has been causing quite a stir.
Future Advocacy, a London-based think tank, released deepfake videos of Boris Johnson and Jeremy Corbyn seeming to back each other in the forthcoming general election.
‘As far as we know this is the first time deepfakes have been released during a live election cycle,’ the think tank told Metro.co.uk.
While the videos were meant as a rallying cry to address the problem of deepfakes online, some commentators have suggested that making them causes more harm than good. The reason? Some people may not be able to tell the difference.
While the voices in the videos don’t sound exactly like those of Johnson or Corbyn, the likeness is enough to convince viewers watching in a hurry or not paying close attention.
Future Advocacy stands by the videos and says that over the last few years politicians have collectively failed to address the issue of disinformation online.
‘Deepfakes represent a genuine threat to democracy and society more widely. They can be used to fuel misinformation and totally undermine trust in audiovisual content,’ said Areeq Chowdhury, the head of Future Advocacy.
‘The responsibility for protecting our democracy lies in the corridors of Westminster not the boardrooms of Silicon Valley. By releasing these deepfakes, we aim to use shock and humour to inform the public and put pressure on our lawmakers.
‘This issue should be put above party politics. We urge all politicians to work together to update our laws and protect society from the threat of deepfakes, fake news, and micro-targeted political adverts online.’
Writing for Metro.co.uk, futurist speaker and author Richard Watson said: ‘The sophistication of deepfakes is such that even an expert could struggle to tell what’s real and what’s not.
‘We’re not talking about ventriloquist dummies here. We’re talking about things that look, sound and feel utterly convincing. This might all sound trivial, and many of the examples are, but sometimes things can get more serious. What about a ‘leaked’ video showing soldiers shooting innocent children in Belfast? This could be fake, aimed at inciting support for someone, and it could easily end up with people being killed for real.
‘By the time the forgery has been spotted it’s too late. Things can and do escalate, with ‘unbelievable’ videos in particular becoming viral hits almost instantly.’