The real story of the news website accused of fuelling riots

Police stand before a burning car in Sunderland – one of several towns and cities hit by riots

What connects a dad living in Lahore, Pakistan, an amateur hockey player from Nova Scotia – and a man named Kevin from Houston, Texas?

They’re all linked to Channel3Now – a website whose story giving a false name for the 17-year-old charged over the Southport attack was widely quoted in viral posts on X. Channel3Now also wrongly suggested the attacker was an asylum seeker who arrived in the UK by boat last year.

This, combined with untrue claims the attacker was a Muslim from other sources, has been widely blamed for contributing to riots across the UK – some of which have targeted mosques and Muslim communities.

The BBC has tracked down several people linked to Channel3Now, spoken to their friends and colleagues, who have corroborated that they are real people, and questioned a person who claims to be the “management” at the site.

What I found appears to be a commercial operation attempting to aggregate crime news while making money on social media. I did not find any evidence to substantiate claims that Channel3Now’s misinformation could be linked to the Russian state.

The person claiming to be from Channel3Now’s management told me that the publication of the false name “shouldn’t have happened, but it was an error, not intentional”.

The false article did not have a named byline, and it is unclear exactly who wrote it.

———

A Nova Scotia amateur hockey player called James is the first person I track down linked to Channel3Now. His name appears as a rare byline on a different article on the site, and an image of him pops up on a related LinkedIn page.

A Facebook account linked to James has just four friends, one of whom is named Farhan. His Facebook profile says he’s a journalist for the site.

I message dozens of their followers. A social media account for the school where James played hockey, and one of his friends, confirm to me he is a real person who graduated four years ago. When I get in touch, his friend says James wants to know “what would his involvement be about in the article?”. After I respond, there is no denial James is affiliated with the site – and his friend stops replying.

Former colleagues of Farhan, several based in Pakistan, confirm his identity. On his social media profiles he posts about his Islamic faith and his children. His name is not featured on the false article.

Not long after I message, Farhan blocks me on Instagram, but I finally hear back from Channel3Now’s official email.

An archived Channel3Now story gave a false name for the Southport attacker, wrongly claimed he was an asylum seeker and incorrectly said he was on “an MI6 watch list”. Channel3Now later apologised for incorrectly naming him.

The person who gets in touch says he is called Kevin, and that he is based in Houston, Texas. He declines to share his surname and it is unclear if Kevin is actually who he says he is, but he agrees to answer questions over email.

Kevin says he is speaking to me from the site’s “main office” in the US – which fits with the timing of posts on some of the site’s social media profiles, and with the times he replies to my emails.

He signs off initially as “the editor-in-chief” before he tells me he is actually the “verification producer”. He refuses to share the name of the owner of the site who he says is worried “not only about himself but also about everyone working for him”.

Kevin claims there are “more than 30” people in the US, UK, Pakistan and India who work for the site, usually recruited from sites for freelancers – including Farhan and James. He says Farhan in particular was not involved in the false Southport story, which the site has publicly apologised for, and blames “our UK-based team”.

In the aftermath of the false claims shared by Channel3Now, it was accused of being linked to the Russian state, on the basis of old Russian-language videos on its YouTube channel.

Kevin says the site purchased a former Russian-language YouTube channel which focused on car rallies “many years ago” and later changed its name.

There were no videos posted to the account for around six years before it began uploading content related to Pakistan – where Farhan is based and where the site admits to having writers.

“Just because we purchased a YouTube channel from a Russian seller doesn’t mean we have any affiliations,” Kevin says.

“We are an independent digital news media website covering news from around the world.”

It is possible to buy and re-purpose a channel that has already been monetised by YouTube. It can be a quick way to build an audience, enabling the account to start making money right away.

‘As many stories as possible’

Although I’ve found no evidence to back up these claims of Russian links to Channel3Now, pro-Kremlin Telegram channels did reshare and amplify the site’s false posts. This is a tactic they often use.

Kevin said the site is a commercial operation and “covering as many stories as possible” helps it generate income. The majority of its stories are accurate – seemingly drawing from reliable sources about shootings and car accidents in the US. However, the site has shared further false speculation about the Southport attacker and also the person who attempted to assassinate Donald Trump.

Following the false Southport story and media coverage about Channel3Now, Kevin says its YouTube channel and almost all of its “multiple Facebook pages” have been suspended, but not its X accounts. A Facebook page called the Daily Felon, which exclusively re-shares content from the site, also remains live.

Kevin says that the blame for the social media storm relating to the Southport suspect, and the subsequent riots, cannot be laid squarely on a “small Twitter account” making “a mistake”.

To some extent, he is right. Channel3Now’s incorrect story did become a source cited by lots of social media accounts which made the false accusations go viral.

Several of these were based in the UK and the US, and have a track record of posting disinformation about subjects such as the pandemic, vaccines and climate change. These profiles have been able to amass sizeable followings, and push their content out to more people, following changes Elon Musk made after buying Twitter.

More than 400 arrests have been made during the outbreaks of disorder

One profile – belonging to a woman called Bernadette Spofforth – has been accused of making the first post featuring the false name of the Southport attacker. She denied being its source, saying she saw the name online in another post that has since been deleted.

Speaking to the BBC on the phone, she said she was “horrified” about the attack but deleted her post as soon as she realised it was false. She said she was “not motivated by making money” on her account.

“Why on earth would I make something up like that? I have nothing to gain and everything to lose,” she said. She condemned the recent violence.

Ms Spofforth had previously shared posts raising questions about lockdown and net-zero climate change measures. However, her profile was temporarily removed by Twitter back in 2021 following allegations she was promoting misinformation about the Covid-19 vaccine and the pandemic. She disputed the claims and said she believed Covid is real.

Since Mr Musk’s takeover, her posts have received more than a million views fairly regularly.

The false claim that Ms Spofforth posted about the Southport attacker was quickly re-shared and picked up by a loose group of conspiracy theory influencers and profiles with a history of sharing anti-immigration and far-right ideas.

Many of them have purchased blue ticks which, since Mr Musk took over Twitter, give their posts greater prominence.

Another of Mr Musk’s changes to X has meant promoting these ideas can be profitable, both for conspiracy theory accounts and for accounts with a commercial focus such as Channel3Now.

Millions of views

Some profiles like this have racked up millions of views over the past week posting about the Southport attacks and subsequent riots. X’s “ads revenue sharing” means that blue-tick users can earn a share of revenue from the ads in their replies.

Users with fewer than half a million followers who have generated income in this way estimate that accounts can make $10-20 per million views or impressions on X. Some of the accounts sharing disinformation are racking up more than a million impressions on almost every post, and are posting several times a day.

Other social media companies – aside from X – also allow users to make money from views. But YouTube, TikTok, Instagram and Facebook have previously de-monetised or suspended some profiles posting content that breaks their guidelines on misinformation. Apart from rules against faked AI content, X does not have guidelines on misinformation.

While there have been calls from politicians for social media companies to do more in the wake of the riots, the UK’s recently enacted Online Safety Act does not currently legislate against disinformation, after concerns that doing so could limit freedom of expression.

Plus, as I found tracking down the writers for Channel3Now, the people involved in posting false information are often based abroad, making it a lot trickier to take action against them.

Instead, the power to deal with this kind of content right now lies with the social media companies themselves. X has not responded to the BBC’s request for comment.
