Mysterious New YouTube News Channels Fronted by Deepfake Presenters

Is it fake news if the lips of a seemingly human presenter are synchronized with words actually spoken by a machine? That is the strange question I find myself asking after discovering five YouTube channels that were launched in late 2021 and which are using automated production-line methods to create many short news videos presented by anchors who are deepfakes.

Business Breakfast uploaded 148 videos during the 14 days to January 9, an average of over 10 videos per day. A glance at the screengrab from their YouTube account shows how every video follows the same format: a presenter standing in a virtual studio next to a still image meant to represent the story being covered.

The YouTube page for Business Breakfast says the account was created on October 23, whilst their oldest video dates back to November 29. The creators claim to be in the United Kingdom but gave only this explanation of who they are.

Provide you with newest BUSINESS info.

Most of the videos last only a minute or two, and they typically receive very few views: the most popular has been seen over 2,000 times, the 10th most popular has only 121 views, and 67 videos on the account have fewer than 10 views each. Many of the news stories cover topics of worldwide interest built around words that often feature in internet searches, such as the prices of cryptocurrencies and commodities, the names of tech gadgets, and the names of global businesses like Amazon. Many other stories focus on Indian affairs, even though the presenter is East Asian in appearance. This suggests the news stories were supplied by an Indian news agency working in the English language.

You can tell the voice used for Business Breakfast is computer generated because the intonation of words is often wrong, the names of some well-known people and places are comically mispronounced, and the computer often fails to include appropriate pauses between words. Furthermore, the voice remains identical even though a small number of earlier videos featured a different female presenter. This raises the question of how the lips have been synced with the words. The lip movements might have been deepfaked to create the illusion that the presenter’s mouth moves in time with the words, or perhaps the presenter looks real because the makers captured video of a genuine person and then deepfaked the animation of the entire face, or even the whole body.

The presenter’s body language mimics normal human behavior but bears no relationship to the words being said. A typical gesture involves showing the palm of one or both hands before clasping them again, but this occurs repeatedly and at odd moments that have nothing to do with the verbal content. If the title of a video is too long to fit in the space for the caption it simply disappears out of frame, whilst the text being spoken is perfectly reproduced as subtitles. The perfect match between the spoken and written words indicates the same text file was used to generate both. Real-life transcription for subtitles is hard because people speak faster than transcribers can type, and human speakers never perfectly reproduce the words of a script, so ordinary subtitles would never match the spoken words so exactly.
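To illustrate why identical source text would explain the flawless captions, here is a minimal sketch of the kind of pipeline that could produce such videos, in which a single script file drives both the synthetic voice and the subtitle track. The gTTS library, the eight-word caption chunks and the speaking-rate figure are my own assumptions for the sake of the example; the tools actually used by these channels are unknown.

```python
# Minimal sketch: one script file drives both the synthetic voiceover and the
# subtitles, so the two can never disagree. gTTS and the timing heuristics are
# illustrative assumptions, not the channels' actual tooling.
from gtts import gTTS


def timestamp(seconds: float) -> str:
    # Format seconds as an SRT timestamp, e.g. 00:00:03,200
    ms = int(round(seconds * 1000))
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"


def render_story(script_text: str, job_id: str, words_per_second: float = 2.5) -> None:
    # Synthesize the voiceover directly from the script text.
    gTTS(script_text).save(f"{job_id}.mp3")

    # Build subtitles from the *same* text, guaranteeing a perfect match.
    words = script_text.split()
    entries, start = [], 0.0
    for i in range(0, len(words), 8):  # roughly eight words per caption
        chunk = words[i:i + 8]
        end = start + len(chunk) / words_per_second
        entries.append(
            f"{len(entries) + 1}\n{timestamp(start)} --> {timestamp(end)}\n{' '.join(chunk)}\n"
        )
        start = end
    with open(f"{job_id}.srt", "w", encoding="utf-8") as f:
        f.write("\n".join(entries))
```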

The production-line approach to creating these videos is evident from the titles that some videos carry immediately after being uploaded. These were the verbatim titles of the two most recent Business Breakfast videos at the time this article was written.

Job 20220109 113926 592147145145CES 2022 Highlights: 83 Glimpses of the Future From Tech’s Big Show

Job 20220109 113926 770532509191Facebook’s Data Center Plans Rile Residents in the Netherlands

Evidently somebody updates the titles at a later date to remove the job number included by the person tasked with creating and uploading the lip-synched video files.
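Judging solely from these two titles, the raw upload name appears to consist of the word “Job”, a date, a time, and a numeric job identifier fused directly onto the headline. The short sketch below shows how such a prefix could be stripped automatically; the assumption that the identifier is always twelve digits long is mine, based on nothing more than these two examples.

```python
import re

# Apparent structure, inferred from only two observed titles:
# "Job <YYYYMMDD> <HHMMSS> <numeric id><headline>".
# The twelve-digit length of the id is an assumption, not a confirmed fact.
JOB_PREFIX = re.compile(
    r"^Job (?P<date>\d{8}) (?P<time>\d{6}) (?P<job_id>\d{12})(?P<headline>.+)$"
)


def strip_job_prefix(raw_title):
    """Split an automated upload title into its apparent components."""
    match = JOB_PREFIX.match(raw_title)
    return match.groupdict() if match else None


print(strip_job_prefix(
    "Job 20220109 113926 770532509191Facebook’s Data Center Plans Rile Residents in the Netherlands"
))
# {'date': '20220109', 'time': '113926', 'job_id': '770532509191',
#  'headline': 'Facebook’s Data Center Plans Rile Residents in the Netherlands'}
```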

This might seem like a bizarre one-off channel, except that several other YouTube channels use the same methods. The YouTube account for VC Invest News was created on September 19 and was used to upload 269 short news videos during the 14 days to January 9. They provided only the following description of themselves.

welcome to my channel :)) we are here to bring a brand new sound to youtube life, don’t forget to subscribe and like our channel.))

As you can see from the following screengrab, all the videos on VC Invest News follow the same pattern as Business Breakfast. The presenter is shown to the left of a still image that is supposed to reflect the story. Both channels use a similar virtual background, and the size and location of the story image are identical within the frame.

A third YouTube channel called Business Tech News was created on October 23 and also follows the same approach, except the avatar on this channel is male, and hence uses a different computer-generated voice. They uploaded 111 videos during the 14 days to January 9.

A fourth YouTube channel called Venture Capital News was created on September 23 and uses a different male avatar and voice to Business Tech News but has the exact same virtual background. They uploaded 113 videos during the two weeks ending January 9.

A fifth YouTube channel called Awesome Startups was created on October 23 and has a different female avatar but uses the same female voice and the same studio background as Business Breakfast. New videos were being added at approximately 10-minute intervals whilst this article was being drafted, and the channel had uploaded 143 videos during the previous fortnight at the moment this sentence was written.

Some of the oldest videos on these channels differ because they involve real people using their own voices when talking to camera. For example, the oldest videos on Venture Capital News and Business Tech News feature two different shop owners who are probably based in South East Asia, given the variety of languages they use in front of camera. One is showing off her stocks of handbags to live viewers who ask questions that she reads from an off-camera screen, whilst the other is similarly answering live questions about the fashion garments she is trying to sell. This suggests all of these news videos are being made by an entrepreneur based in South East Asia who has aspirations to provide marketing services to small businesses as well as to generate revenue from YouTube adverts.

It is possible that the same people are using a sixth, a seventh, or many more channels to farm revenues from YouTube. I stopped my research here because my point was already proven. The makers of these videos may not be breaking any YouTube rules, though I find it suspicious that all five of the accounts I identified have amassed slightly over 1,000 subscribers since they were each created in September or October of 2021. This might indicate that the same 1,000 bogus social media accounts were used to inflate the apparent following of each of the new channels. I will report the matter to YouTube if I have time, but feel little motivation as YouTube are likely to fob me off instead of taking any action to prevent viewers from being misled. YouTube always claims it goes to great lengths to protect copyright, but there is a distinct possibility that the text being read by the computer avatars is reproduced without permission, which would be a copyright violation. A reputable source of news would explicitly state the source of any content reproduced word-for-word, even if it had also paid for the right to license that content.

In some ways this story is ridiculous, and it is not obvious if any harm might result from these silly videos that show pretend people reading recent news about actual events. However, I have observed before that there is a dark side to deepfaking real people. Deepfakes are deceptions, even if they are well intentioned. These YouTube channels prove that technology which can be used to trick facial recognition tools or fool unwary YouTube viewers is already widely available. You do not need to be a spy agency or criminal mastermind to inexpensively create reproductions of people that many will accept as real until they are told to examine them more closely. The risk is genuine, even if the people have been faked.

Eric Priezkalns
http://revenueprotect.com

Eric is the Editor of Commsrisk. Look here for more about the history of Commsrisk and the role played by Eric.

Eric is also the Chief Executive of the Risk & Assurance Group (RAG), an association of professionals working in risk management and business assurance for communications providers. RAG was founded in 2003 and Eric was appointed CEO in 2016.

Previously Eric was Director of Risk Management for Qatar Telecom and he has worked with Cable & Wireless, T‑Mobile, Sky, Worldcom and other telcos. He was lead author of Revenue Assurance: Expert Opinions for Communications Providers, published by CRC Press.

