This means we mainly encounter fake news stories in our social feeds as attention-grabbing headlines which cultivate an emotional reaction in us, whether that's anger, sadness, or amusement. Publishers of fake news stories do so for a range of reasons. They might do it simply to make money from advertising revenues generated by users clicking on their "clickbait" headlines. On the other end of the spectrum, in the words of a report from Facebook's security team two weeks ago, "we identified malicious actors on Facebook who, via inauthentic accounts, actively engaged across the political spectrum with the apparent intent of increasing tensions between supporters of these groups and fracturing their supportive base." In other words, some publishers of fake news do so to cause friction in our society. That last motive is particularly concerning for us in the UK as we head into a general election.
It's easy to believe that we're immune to manipulation, but truth be told, we human beings are an emotional lot. Whether it's a toilet roll advert featuring a puppy that made you cry, or one that empowered you by saying a girl can grow up to be whatever she wants, things with an emotional message tend to stick around in our subconscious and influence us. Take, for example, the slogan "take back control" from the UK's Vote Leave campaign vs. the Remain campaign's... I can't remember it, can you?
Back to fake news. It's an incredibly hard problem to solve because of the scale of the issue and the difficulty in drawing a line where opinion becomes fake news. It's something that all social platforms are battling with, although Facebook faces the most attention as it counts a quarter of the world's population as members.
Facebook's answer has been to allow users to flag stories they feel are fake to third-party fact checkers. The fact checkers review the story, and if they agree it is fake, Facebook displays a "disputed" label next to it. People are also warned it is "disputed" when they try to share it.
It's better than nothing but it's not a fully formed solution for three reasons:
First of all, quite literally millions of pieces of content are shared on Facebook every minute. Even if only a tiny fraction of this is fake news, that's a lot of content to check. Right now these checks are being done by humans, and there's a limit to the volume of content they can review.
Secondly, if a story is deemed to be fake, it is labelled as "disputed" rather than being removed from the site. Going back to my point about emotions, if I strongly believe that something is true, a disputed flag is not going to cause me to re-evaluate my views. Last year over 2 million people interacted with a fake story claiming that Obama had banned the Pledge of Allegiance in American schools. If I already distrusted Obama, placing a "disputed" flag next to the story would not stop it subconsciously influencing me.
Finally, trust in the media is falling. 24% of British people trust the media. In America only 14% of Republicans trust the press. Facebook is treading a fine line where they could very easily be perceived as part of the "crooked media" downplaying another "alternate fact".
So, as emotional, easily swayed human beings, we are the worst people to save ourselves from being manipulated by fake news. It is worrying, then, that Facebook is asking its flawed, human users to spot and report fake stories. It recently took out ads in British newspapers with tips on how to detect fake news. This just doesn't make sense. Picture a world where unsafe cars were made and sold. Imagine if it were the responsibility of the person buying the car to inspect the entire engine to check it was roadworthy. Would we think this was acceptable?
The most important reason why it's not helpful to place the responsibility on the consumer is that many people within our society haven't been fortunate enough to have access to a good education. 16% of the UK's population is functionally illiterate. That means they can read and process simple, familiar text, but they have difficulty taking a piece of text with conflicting views, processing it, and drawing their own conclusions.
Tabloid newspapers are written to be accessible to all readers; for example, the Sun has the reading age of an eight-year-old. Does Facebook really believe its checklist will help all Brits, including those who are most vulnerable to fake news stories? Its choice of newspapers in which to place these adverts (The Guardian, The Times and The Telegraph), which have higher reading ages, suggests not.
We should demand the standard of quality we experience with other products from social platforms. Facebook's current offerings are a good start to tackle the problem but they're not the end solution. Some of us may not have been fortunate enough to have had a good education. Some of us may arrogantly believe we're immune to emotional influences (when we really aren't). Whichever camp we fall into, it's time to say enough is enough. Let's ask the organisations who are profiting from the impact they have on our society to take proper responsibility for their actions.