
Channel 4 delivers ‘deepfake’ Christmas message

Channel 4’s ‘Alternative’ Christmas message replicated the traditional annual broadcast (Credit: Channel 4)
Amid rapid advances in computer-generated imagery (CGI) and facial recognition technology, a growing number of heavily edited, misleading videos are entering the mainstream media, and broadcasters are having to decide where they stand on the issue.

By Jack Robert Stacey | Technology Editor

Channel 4’s five-minute Christmas message, presented as an ‘alternative’ to the Queen’s traditional festive speech on BBC One, features an artificially rendered version of the monarch voiced by English actress Debra Stephenson. Reflecting on the year, the deepfake Queen discusses the challenges of 2020 and the departure of senior royals Prince Harry and Meghan Markle, before performing a TikTok-inspired dance routine.

According to data compiled by Sensity.ai, an AI-based platform that monitors manipulated and synthetic video, more than 60,000 deepfakes were created in 2020 alone, an increase of over 250% on the previous year.

Essentially, these ‘deepfakes’ (like the broadcaster’s Christmas message) are created by digitally superimposing one person’s likeness onto an existing image or video, using autoencoders or similar machine learning techniques to replace one individual with another. The ‘alternative’ Christmas message forms part of Channel 4’s short season on “the emergence of synthetic media” and is explored further in the recently released Dispatches documentary ‘Deepfakes: Can You Believe Your Eyes’.
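To illustrate the mechanics, the sketch below shows the shared-encoder, dual-decoder autoencoder arrangement commonly described for face swapping: one encoder learns features common to both faces, while each decoder learns to reconstruct only its own person. It is a minimal, illustrative example written in PyTorch; the class names, network sizes and training details are assumptions for demonstration only and do not describe Channel 4’s production pipeline.

```python
# Minimal, illustrative sketch of the shared-encoder / dual-decoder
# autoencoder scheme often described for face swapping.
# All names, sizes and details are assumptions for demonstration only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses a 64x64 RGB face crop into a small latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a 64x64 face from the shared latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder; a separate decoder per person.
encoder = Encoder()
decoder_a = Decoder()   # trained only on face A (e.g. the performer)
decoder_b = Decoder()   # trained only on face B (e.g. the target likeness)

# At swap time, a frame of face A is encoded and decoded with B's decoder,
# producing B's likeness with A's pose and expression.
frame_of_a = torch.rand(1, 3, 64, 64)   # stand-in for a real video frame
swapped = decoder_b(encoder(frame_of_a))
print(swapped.shape)  # torch.Size([1, 3, 64, 64])
```

In practice, systems of this kind are trained on large numbers of aligned face crops of each person, and further blending and colour-correction steps are needed to composite the swapped face convincingly back into each video frame.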

In a recent statement, Ian Katz, Channel 4’s Director of Programmes, spoke of the “frightening new frontier” that deepfake technology represents and highlighted the growing “battle between misinformation and truth.” Katz, a former deputy editor of the Guardian, contended that the broadcaster’s Christmas message, “seemingly delivered by one of the most familiar and trusted figures in the nation – is a powerful reminder that we can no longer trust our eyes.”

As one of the first major broadcasters to experiment with deepfakes, Channel 4 intends its light-hearted ‘alternative Christmas message’ as a comment on the serious impact that misinformation has on our hyper-digital modern society.

The technology behind deepfakes was initially developed by academic researchers in the 1990s and, after much refinement, eventually saw popular use in amateur work created by small online communities. Although the practice is most commonly associated with hoaxes, financial scams, and pornography, the technology has recently been used to create convincing artificial content featuring significant public figures, such as former US president Barack Obama.

Currently, there is little firm legislation targeting the deceptive use of deepfakes but, with more manipulated videos likely to be released in the future, the government is expected to address the topic over the next few months.

Henry Ajder, a prominent expert in the expanding field of deepfakes and other disinformation practices, asserted: “As a society, we need to figure out what uses for deepfakes we deem acceptable, and how we can navigate a future where synthetic media is a big part of our lives.” Turning to the ethics of deepfake videos, Ajder noted that, like all responsible broadcasters, “Channel 4 should be encouraging best practice” and suggested that disclaimers or watermarks be shown before such videos are broadcast.

In our increasingly digital society, with accessible forms of artificial intelligence and digital imaging technology developing steadily, it is becoming ever harder to distinguish authentic videos from so-called ‘deepfakes’.
