By Chris Stokel-Walker, Features correspondent
A start-up is developing a news service presented by anchors created by artificial intelligence. Will it upend decades of parasocial relationships between television audiences and the people they watch on screen?
The footage wouldn’t look out of place on many of the world’s news channels.
For 22 minutes, a variety of polished news anchors stand in front of the camera and run down the day’s news in a video posted on social media. But none of them are real. Instead, the anchors are generated by artificial intelligence (AI).
The video is produced by Los Angeles-based Channel 1, a start-up created by entrepreneurs Adam Mosam and Scott Zabielski, who plan to roll out AI-generated news on a streaming TV channel later this year. “There seemed to be a very interesting opportunity to level up the user experience of news,” says Mosam, by using AI to tailor content to individuals.
AI technology can also help translate scripts and interviews from one language to another – capabilities that Channel 1 demonstrated in a promotional video, which was shared in December.
Channel 1 is the latest demonstration of AI-powered news presenters around the world. In Kuwait, an AI persona by the name of Fedha ran through the headlines for Kuwait News. Hermes presented the news in May 2023 for Greek state broadcaster ERT. South Korean broadcaster SBS handed over the duties of news presenting to Zae-In, an AI-generated deepfake, for five months this year. There are others in India and Taiwan, too – all created by AI.
But there’s one key question still to be answered: will viewers trust news delivered by AI, rather than humans?
Trust in the humans presenting the news has dropped to an all-time low, according to a survey by public opinion polling firm Ipsos. Just 42% of people in the UK trust TV newsreaders, down 16 percentage points in a year. The scepticism around news presenters as independent arbiters of truth is an unusual modern phenomenon, with many choosing to get their news instead from individual creators or influencers.
Those social media stars leverage a connection with their audience known as a parasocial relationship. The term was coined in the 1950s by academics at the University of Chicago to describe the belief among viewers of the era's nightly news programmes that the anchor behind the desk was talking through the camera directly to them. News presenters became more than just journalists telling you the news; they were friends, welcomed into your living rooms night after night.
Social media influencers have co-opted that same direct-to-camera format and perceived personal connection to great success. “It’s interesting how the ‘parasocial’ label has evolved from describing the affinity that individuals might feel with remote newscasters to something way broader,” says Christine H Tran, who studies digital platforms and labour at the University of Toronto. “You can be in a parasocial relationship with a reporter or news Twitch streamer,” she says, referring to the livestreaming video service. “But you can also – apparently – be in a parasocial relationship with a YouTuber, a singer, and an Instagram power couple.”
But whether AI can replicate the personal connection is less certain. “You’ll never have the same connection with an AI that you do with another human being,” admits Mosam. However, he argues that people are no longer looking for impartiality. “We’re not doing this because we think a robot does a better job than a human – that’s ridiculous.”
The idea of journalists not reading the news isn’t all that unusual, even if the idea that they’re computer-generated is. “When I started in journalism, you had actors reading the news,” says Nic Newman, senior research associate at the Reuters Institute for the Study of Journalism at the University of Oxford, and a former editor at the BBC. “People were kind of fine with that.”
Because journalists haven't always read the news, Newman reckons this experiment could well succeed – within limits. He thinks AI presenters will only be useful for short news bulletins, and he is less certain that viewers will form a parasocial relationship with an AI anchor. “For delivering news programmes, I think the humanity is going to remain really important,” he says.
Tran is similarly unsure. “Will AI personalities inspire the same parasociality if their broadcasts are accurately labelled as ‘AI content’ and the viewers know there’s no personal life outside the screen?” they ask. “That depends on if the platforms hosting AI presenters will be expected to label their content as AI like some platforms such as Instagram have considered moving towards.”
Channel 1 and NewsGPT, which claims to be the world’s first news channel generated entirely by AI, may have another question to answer: is it possible to totally remove the human from the loop?
At present, Channel 1 has nearly a dozen staff members checking AI-generated scripts and selecting the stories that ought to be covered. Mosam says every story goes through a 13-step process before broadcast, designed to stop the known problems of generative AI – such as hallucination, where AI tools make up content, an obvious no-no in journalism – from making it onto air. Channel 1 is also looking to hire an editor-in-chief early next year.
Actually finding newsworthy events and reporting on them is another element AI may struggle with, Mosam and Newman agree. The Channel 1 test episode relied heavily on stories unearthed and footage filmed by human journalists. “Without those sources, if they get cut off, I don’t really see how they can do it,” says Newman. “If that raw material is not there, then the AI has absolutely nothing to work on.”
Mosam believes that some elements of the reporting process can be carried out by AI, but others cannot. “You’ll never be able to gather intelligence person-to-person, and interview person-to-person, effectively,” he says. “But I could fly a drone and analyse what I’m looking at.” Newsgathering solely by AI, without humans in the loop, isn’t in Channel 1’s current plans.