Channel 4 under fire for deepfake Queen's Christmas message

Channel 4 has sparked controversy and debate with a deepfake video of the Queen, to be aired on Christmas Day as an alternative to her traditional festive broadcast.

The broadcaster will show a five-minute video in which a digitally altered version of the Queen shares her reflections on the year, including the departure of Prince Harry and Meghan Markle as senior royals and the Duke of York’s involvement with the disgraced financier Jeffrey Epstein.

The deepfake Queen, voiced by the actor Debra Stephenson, can also be seen performing a dance routine popularised on the social media platform TikTok.

Channel 4 said the broadcast was intended to give a “stark warning” about the threat of fake news in the digital era, with its director of programmes, Ian Katz, describing the video as “a powerful reminder that we can no longer trust our own eyes”.

Some experts suggested the broadcast might make the public think deepfake technology was more commonly used than is the case.

“We haven’t seen deepfakes used widely yet, except to attack women,” said Sam Gregory, the programme director of Witness, an organisation using video and technology to protect human rights. “We should be really careful about making people think that they can’t believe what they see. If you’ve not seen them before, this could make you believe that deepfakes are a more widespread problem than they are,” he said.

“It’s fine to expose people to deepfakes, but we shouldn’t be escalating the rhetoric to claim we’re surrounded by them.”

Areeq Chowdhury, a technology policy researcher behind deepfakes of Jeremy Corbyn and Boris Johnson during the 2019 general election, said he supported the decision to highlight the impact of deepfakes, but argued that the technology did not pose a widespread threat to information sharing.

“The risk is that it becomes easier and easier to use deepfakes, and there is the obvious challenge of having fake information out there, but also the threat that they undermine genuine video footage, which could be dismissed as a deepfake,” he said.

“My view is that we should generally be concerned about this tech, but that the main problem with deepfakes today is their use in non-consensual deepfake pornography, rather than information.”

Deepfakes expert Henry Ajder said: “I think in this case the video is not sufficiently realistic to be a concern, but adding disclaimers before a deepfake video is shown, or adding a watermark so it can’t be cropped and edited, can help to deliver them responsibly.

“As a society, we need to figure out what uses for deepfakes we deem acceptable, and how we can navigate a future where synthetic media is an increasingly big part of our lives. Channel 4 should be encouraging best practice.”

source: theguardian.com