
For the sake of “real” news, Google must build public oversight of its new search algorithm

By Simon Davies

Over the past few weeks, there have been reports that Google’s new search algorithm – intended to demote fake news and conspiracy theories – has in fact been censoring legitimate news sites. More worrying still, it’s arguable that the world’s most extensive censorship regime has just begun, with hardly a whisper from mainstream media.

A number of reports claim that left-wing, anti-war and progressive websites have experienced a sharp fall in traffic generated by Google searches. The World Socialist Web Site has seen, within just one month, a 70 percent drop in traffic from Google.

In its blog, Google argues that a body of ten thousand “evaluators” will drive the new system and that these people must abide by the company’s Search Quality Rater Guidelines. The updated guidelines “provide more detailed examples of low-quality webpages for raters to appropriately flag, which can include misleading information, unexpected offensive results, hoaxes and unsupported conspiracy theories.”

This is a rather bizarre assertion. If a conspiracy claim can be supported, surely it becomes “real” news. And if it cannot be supported – because of state or corporate censorship, or because of such devices as gagging orders – the story will remain a conspiracy theory. That is, unless a mainstream outlet publishes it.

Setting aside for a moment the practical considerations involved in creating fairness and accuracy in such a system, Google is historically in dangerous waters. Some of the most important stories of recent decades started as conspiracy theories. Watergate was a conspiracy, as were claims of LSD experiments conducted by the CIA. The Gulf of Tonkin incident in the Vietnam War, the Bilderberg and Bohemian Grove gatherings of world elites, the Abraham Lincoln assassination, the tobacco/cancer cover-up, official deception over Saddam Hussein’s Weapons of Mass Destruction, murder plots by MI6, Operation Northwoods… the list goes on and on. How would Google’s new algorithm handle such reports?

I’ve documented on these pages my own national security work – an activity that, until fifteen years ago, almost all mainstream media dismissed as an unreportable conspiracy theory.

It was 1997, and a stream of whistleblowers and researchers such as James Bamford, Duncan Campbell and Steve Wright had been warning for some years about an institution called the National Security Agency (NSA) and its vast communications interception web. My role was to find a way to bring this remarkable story to the attention of major media and the parliaments.

I pitched the story to the editors of a dozen smaller publications, all of whom treated me as if I were mad. As a media specialist, I found this failure intensely frustrating and seemingly irrational. I’d sold a thousand stories to the press, and this one seemed to me perfect. Even if material evidence wasn’t in abundance, it was a rollicking good yarn peppered with all the elements of a great thriller. But no; not a single media outlet bit on the story. They focused instead on arguing that I needed a holiday – and perhaps some medication.

The reality is that media has an ingrained denial process, and Google needs to understand this dynamic. If a story involves anything unknown – where there’s no corroboration by an official – smaller media will almost never risk publishing unless the claim has been validated by a large established newspaper. Large established papers in turn rarely publish unless the claim has been referenced in another, equally established paper. And big papers don’t follow small ones unless there’s institutional comment. A few reports written by academics – even for the European Commission – didn’t cut it. Thus, a vicious cycle.

So, the reason for the continued silence back then about the world’s biggest spy system, controlled by US security, operating outside national jurisdiction and turning each country into a spy for the others, was purely that it existed outside the media’s frame of reference – a sort of modern-day Emperor with no clothes.

The exact opposite is true of current NSA awareness. It’s as hard these days to find people who doubt the story as it was in 1997 to find people who believed it.

Depressingly, I continued to draw blanks across the media spectrum. One of the Guardian editors responded: “We have enough ammunition on the Special Relationship without adding a conspiracy theory”.

In a last-ditch effort I targeted the Daily Telegraph. As the Conservative flagship, it was the most unlikely place for a story that exposed US security – which was the very reason its backing would carry weight with other media. Under the editorship of Charles Moore, the paper was slowly turning 180 degrees into a freedom fighter (which, by the way, is one of the great untold heroic stories of British media).

I approached a section editor there – Ben Rooney – who had published my work before. He agreed to commission me to write the article, adding the caveat that both our necks were on the line. For the Telegraph – steeped in loyalty to the Anglo-American alliance – this was a brave move.

On December 16th, 1997, the Telegraph published the article as a supplement cover story. Nine days before Christmas and with all parliaments closed, it was the worst possible time to publish, but still, there it was – in print and in colour.

By February of the following year, talk of the global spying system was common currency, and far fewer people doubted its existence.

The story spread from country to country as serious newspapers plagiarised or syndicated the Telegraph article. Excited colleagues in Germany, the US, France, Israel, Canada and Brazil called to express their concerns. And, predictably, people were convinced that because the story had originated in the Daily Telegraph, it was unlikely to be a hoax.

So, how would Google’s new algorithm deal with that situation? It seems it wouldn’t. If Google’s censors simply rely on legitimisation in mainstream media, then a significant spectrum of commentary and reporting will be buried.

Well, section 7.6 of Google’s rating guidelines gives us a clue. It begins “Sometimes, pages just don’t feel trustworthy.”

“Feel”? Seriously? How would Watergate or the NSA have felt? The section continues: “Pages that appear highly untrustworthy should be rated Lowest, even if you’re not able to completely confirm their lack of trustworthiness.”

Google does suggest that the traffic generated by a site could be taken into account, but that should not be the key determinant. This guideline creates a real mess. True, there are many outlandish conspiracy theories out there, but if the Internet had existed in 1971, how might the Raters have assessed commentary about Gary Allen’s groundbreaking book “None Dare Call It Conspiracy”? Never mind that it sold four million copies; it was a work that advanced a (then) bizarre theory that international banking and politics had conspired to control domestic decision-making. Wow, try and find someone these days who doesn’t believe that proposition.

Rating and prioritising the world’s information isn’t like rating a restaurant. The task – if it is to have any real value – must be conducted in a sensitive and open manner. Google should see it as a scientific exercise, not one based on mass “feel”. That means opening the process up to oversight and scrutiny. Without those elements, the company will hold real news hostage to fortune.