YouTube doesn’t know why Alex Jones conspiracy theories top results for “Austin explosions”

Alex Jones' Infowars already has two strikes against it for spreading conspiracy theories on YouTube

An Infowars video claiming that Antifa members are the “prime suspects” in the mysterious package bombings in Austin, Texas, appeared at the top of search results on YouTube Monday, and the company has no idea why.

The video appeared as the third result when VICE News searched for the term “Austin explosions” on Monday evening. Another Infowars video was listed fourth in search results.

The videos were posted by Ron Gibson, who is part of Jones’ Free Speech Systems YouTube network. His channel alone has over 158,000 subscribers, and the video that appeared in YouTube’s search results has been viewed more than 9,000 times.

In the first video Jones asks: “Is Antifa behind the third bombing in Austin?” before answering his own question by calling them “prime suspects.” His evidence? “They are violent, they call for violence, they call for attacking gentrification, any old white people moving into East Austin.” He goes on to say that the accusation is “not a conspiracy theory” despite providing no supporting evidence.

Jones also says Antifa is “listed as a terror group.”

While the videos no longer appeared in search results Tuesday morning, others also saw the Infowars video appear at the top of their search results.

YouTube told VICE News that it has done a lot of work to clean up its search results and prioritize verified news sources, including the introduction of the Top News section, and that it is rolling out algorithmic changes to YouTube search during breaking news events to surface more authoritative content.

“There is still more work to do, but we're making progress,” a YouTube spokeswoman told VICE News.

Read: YouTube admits Florida shooting conspiracies shouldn't have been trending

Infowars has been emblematic of the criticism YouTube has faced in recent months over extremist content and hate speech on its platform. An Infowars video claiming that some survivors of the Parkland school shooting were paid actors hit the number one position on YouTube’s trending section in the aftermath of the Feb. 14 massacre.

Jones’ channel has received two official warnings in the last month. A third warning in the space of three months would automatically trigger a ban. YouTube did not respond to a question about whether this latest video would represent a third strike for Infowars.

While YouTube says it has invested in technology to limit the appearance of conspiracy theory videos in search results during breaking news incidents, the search results VICE News saw did not include a “Top News” section, even though the feature has been rolled out in Ireland, where the search was conducted.

Read: YouTube suspended ads on Logan Paul's channel after he Tasered dead rats

A YouTube spokesman said the team was baffled as to why the Infowars video had appeared at the top of the search results and could not explain the discrepancy.

YouTube said it had been monitoring search results, especially during breaking news events, to see how the improvements to the system were performing, and that it had been happy with how things were working Monday: “What we had seen today had been pretty good.”

YouTube has a history of handing power over to its algorithm and then clawing it back in response to controversies.

"If you look at what Google has done historically, they have pushed algorithmic updates out and maybe it hasn't worked as well as they hoped"

"If you look at what Google has done historically, they have pushed algorithmic updates out and maybe it hasn't worked as well as they hoped and they have to roll them in again,” Andy Barr, founder of 10 Yetis, a digital media agency, told VICE News. “So it's not a simple thing for them to try and fix this."

The problem YouTube faces is dealing with content that is not obviously hate speech and doesn’t explicitly breach its terms of service: videos that sit in a “grey area,” according to Nikita Malik, director of the Centre for the Response to Radicalisation and Terrorism at The Henry Jackson Society.

“These videos may not directly advocate for violence, but they use symbols, rhetoric, and techniques that are common knowledge to the group of 'insiders' that follow the group,” Malik told VICE News.

YouTube’s platform has always been gamed by people looking to provoke or make a profit, such as one bogus science video that has been watched more than 11 million times, earning its producers tens of thousands of dollars in ad revenue.

As a result of these problems, advertisers are running scared, with giants like Unilever threatening to pull their ads from the service. “YouTube is struggling to come up with the algorithmic formula and human curation that will set editorial priorities and promote better quality information,” Charlie Beckett, a professor in the department of media and communications at the London School of Economics, told VICE News.

Barr, who has previously advised both the Conservatives and the Labour Party in the U.K. on social media, added that it could take political intervention before YouTube is finally forced to fix these problems.

“I think once it reaches the stage where politicians are talking about it, that's where historically Alphabet have really stepped up their game and thrown a lot more resources at it," Barr said.

Cover image: Leslie Xia