Societies right across the globe have begun waking up to the impact and massive consequences of 'fake news'.
Fake news, distributed seamlessly on Facebook and Google, has had a direct effect on the outcomes of elections in Europe, the US and the Pacific.
Even before Brexit and Trump, lies, rumours, mud-slinging and gossip, spread without any checks on Facebook, helped elect the populist president Rodrigo Duterte in the Philippines.
The phenomenon next moved on to the UK, where the Leave campaign, with its messages of nationalism, xenophobia and anti-EU rhetoric, was five times louder on social media than the Remain campaign.
In the US, the swing to Donald Trump in the last weeks of the presidential election was helped enormously by industrial levels of fake news spread on Facebook and sponsored by Russia.
Stimulating negative emotions
A new study by the University of Oxford has shown that nearly a quarter of web content shared by Twitter users in the battleground state of Michigan during the final days of the US election was so-called fake news. In Spain last year, three of the most commented-on stories on social media were false. All three - the introduction of conscription, limiting access to third-level education and a ban on religious processions so as not to offend Muslims - were designed to stimulate negative emotions.
This is at the heart of the matter. Facebook is an emotional medium. Something that makes you angry, sad or happy is most likely to be shared with your friends. Fake news, with its make-up of lies and simplifications, is designed to make the user angry about a subject or person and is hugely manipulative. The more the lies are shared, the more public opinion is shaped.
A meeting of Unesco in Paris last month heard that fake news - or, as the Oxford Internet Institute (OII) calls it, "junk news" - is eroding democratic institutions. The problem has become so big that scientific facts are now openly being questioned - such as the link between smoking and certain types of cancer and, of course, climate change.
You would think that, given the enormous consequences for society, Facebook would be taking the matter seriously.
Well, think again. Facebook has become fat and powerful from having no checks and balances on the content it carries.
Between them, Facebook and Google have two-thirds of the entire digital advertising market, and it's estimated they accounted for 90pc of the incremental growth in digital advertising over the last year.
Unwilling to take on responsibility
Paul Nemitz, of the EU Commission's Justice Directorate, told the Unesco conference Journalism Under Fire that the social media giants are draining revenue from other media without any of the responsibilities of a plural media. Both Facebook and Google should be made accountable and responsible for upholding democracy, he said.
He also pointed out that instead of hiring people to ensure fake news and hate were not distributed, the social media platforms were asking NGOs to do the work for them. Neither was Facebook co-operating with calls for data on the level of hate being spread on the internet. The companies are, however, spending millions on lobbyists and lawyers in Brussels and Washington DC to ensure they don't become regulated or forced to share data on junk news and hate.
In Sweden, news organisations that asked Facebook to do something about the problem were asked if they would "clean up" the platform themselves - in other words, asking the media to call out fake news for the giant.
The only place where the social media platform has actually hired people to screen for junk news is Germany, which has threatened and imposed fines on the tech companies. Germany knows well the dangers of unchecked propaganda and lies, and how dangerous rhetoric can undermine democracy in a short space of time.
The power behind Facebook is its algorithms. The algorithm is all-knowing. It means Facebook knows more about you, your friends and your country than anyone or anything. The tech giant can measure sentiment and emotion in any region of the globe.
The algorithm is clearly adjusted to maximise the company's commercial prowess. It works by feeding users what it knows they like. Someone with Right-wing views will be fed Right or Alt-Right content, whether true or false. During the US election, teenagers working in so-called 'Click Farms' in Macedonia were pumping out fake news that favoured Donald Trump. The teenagers were allegedly sponsored by Russia and making thousands of dollars a week from the ads Facebook placed around their junk.
The social media platforms blame the algorithms and not themselves when fascist or hate material surfaces on your feed, washing their hands of the problem. And even if you change your political views, you can't change the algorithm - it will keep feeding you the same content. No button exists called "Clear My Algorithm".
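The self-reinforcing loop described above can be sketched in a few lines of code. This is a toy illustration only - the function and variable names are invented, and it is not Facebook's actual system - but it shows how ranking purely by past engagement keeps serving a user more of whatever they already clicked on, true or false alike:

```python
# Toy sketch of an engagement-maximising feed ranker.
# Illustrative only: names (rank_feed, update_profile) are invented,
# and this is not the real Facebook algorithm.

def rank_feed(posts, clicks_by_topic):
    """Order posts so topics the user engaged with before come first."""
    def score(post):
        # Past clicks on a topic boost every new post on that topic,
        # regardless of whether the post is true or false.
        return clicks_by_topic.get(post["topic"], 0)
    return sorted(posts, key=score, reverse=True)

def update_profile(clicks_by_topic, clicked_post):
    """Each click deepens the preference - the loop reinforces itself."""
    topic = clicked_post["topic"]
    clicks_by_topic[topic] = clicks_by_topic.get(topic, 0) + 1
    return clicks_by_topic

# A user who has clicked partisan content keeps getting more of it.
clicks = {"partisan": 3, "sport": 1}
feed = [
    {"id": 1, "topic": "science"},
    {"id": 2, "topic": "partisan"},
    {"id": 3, "topic": "sport"},
]
ranked = rank_feed(feed, clicks)
print([p["topic"] for p in ranked])  # partisan content ranked first
```

Note that nothing in this loop ever resets the profile - which is the point of the missing "Clear My Algorithm" button: the only input that shapes the ranking is accumulated past engagement.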
Over at Google there are also problems, although its power to influence an electorate through emotion is not as seismic.
Google and YouTube have been in trouble over the last month after it was revealed they were paying millions in advertising dollars to hate groups, jihadists and anti-Semites. Google was paying those who hate western democracy by putting ads for big brands around their hate videos.
Until recently, a video on YouTube showed how to knife a police officer wearing a stab-proof vest, similar to the way PC Keith Palmer was murdered by an Isil sympathiser in London. Responsible and brand-aware advertisers here and in the UK pulled their content from YouTube when they became aware their ads were being used around porn, neo-fascist and Islamist videos. While this was on the one hand embarrassing for the advertisers, on a more serious level, it meant the brands were, in effect, funding the haters.
Similarly, the recommendations Google serves on searches have been found to favour the tech giant's own products. The Wall Street Journal found that in 25,000 random Google searches, ads for its own merchandise appeared in the most prominent slots.
All 1,000 searches for "laptops" started with an ad for Google's own Chromebook. How is that impartial?
The boycott by major media agencies and blue-chip brands forced Google to re-think its model. Some advertisers have returned but are limiting their advertising to music videos on YouTube.
Over at Facebook, there is still a reluctance to carry out verification and fact checking.
Why is it then that companies like Facebook and Google, which earn billions, cannot be held responsible for monitoring their content? As one commentator wrote last week, they are happy to monetise but not monitor.
Philip Howard of the Oxford Internet Institute at the University of Oxford says that the social media platforms are now more important than pollsters, because they know exactly what way an electorate is thinking. Moreover, he says that Facebook is directly serving up fake news to voters before they vote, but will not take any responsibility for this.
He recommends that, as a matter of public policy, Facebook should have its delivery algorithms audited. Facebook claims it does not know why a voter gets a particular post and this, he says, is a major problem. The company will also not collaborate with social scientists. If democratic values are to remain strong - in an era when fewer than one third of Millennials think it is important to live in a democracy - then, Howard says, the company's algorithms must be audited independently.
We live in a social media era where facts are no longer sacred, where, because of junk news, climate change facts are challenged and where manipulation of Facebook can sway electorates.
At the Paris conference, Frank La Rue, assistant director general of Unesco, summed it up: "Truth is not the result of an algorithm. Journalism is the honest intention of informing the public."
Stephen Rae is Group Editor-in-Chief at INM, publishers of the 'Sunday Independent'. He serves on the board of the World Editors Forum and is a member of the Expert Advisory Panel on Facebook