Will Reporting Fake News On Facebook Actually Solve The Problem?


It took the victory of Donald Trump in the US presidential elections to cause an uproar over the flood of fake news on this social network and to make Mark Zuckerberg realize the seriousness of the problem, prompting him to announce a fierce fight against dubious content. But the fight, at least in this form, is lost in advance, I'm sure. That is why I wrote this long text to convince you as well.


After Donald Trump won the elections, many faces showed disbelief – except those who had accurately predicted his victory (like a Croatian team of analysts) – then anger, followed by a loud search for the “culprits”. One of them was found in Facebook, or more precisely, in the fact that this social network became fertile ground for spreading fake news which, according to some estimates, polarized voters and thus influenced the election results.

At first, Mark Zuckerberg denied these claims, only to announce, just a couple of days later, a plan to tackle the issue of false content seriously. The problems here are complex, both technically and philosophically, he said, and he wasn’t wrong. On the one hand, Facebook gives people a voice, and this voice can reach hundreds, thousands, millions of others; on the other hand, it determines which of that content gets a bigger reach and which does not.

How the algorithm (unsuccessfully) replaced the editors

A similar problem occurred with Facebook’s trending section – there, the social network hired a sort of “editors” who would assess which popular topics were really worth highlighting. However, they were soon dismissed amid criticism of bias, and the algorithm was left to do its magic. Well, we’ve seen what “interesting” things the algorithm let through – such as the “news” that Fox News presenter Megyn Kelly had been fired for secretly supporting Hillary Clinton as the presidential candidate. What was the problem? The news was fake. It was picked up by the conservative page EndingtheFed.com from Conservative101.com, which had produced fake viral news before (it claimed Tom Hanks supported Donald Trump, even though the truth was exactly the opposite).

However, the article was shared enough times to meet the requirements for becoming a trending topic – and it got extra reach. And because everything was in the hands of AI, no one stopped it… and the story circulated.

Mark Zuckerberg posted on Friday, 18 November 2016: “A lot of you have asked what we’re doing about misinformation, so I wanted to give an update. The bottom line is: we…”

After the election, Zuckerberg promised to step up efforts to identify fake news, to make reporting it easier, to set up warnings that certain news is potentially fake, and, perhaps most interestingly, to seek advice from media people once again – this time not from his own editorial team, but from a team of experts who would check the facts, including people from ABC News, PolitiFact, FactCheck.org, Snopes and The Associated Press.

The procedure should go like this: after a user reports fake news by clicking the button in the top right corner, and after a software check flags the story as suspicious, a consortium of journalists would check the facts. If they decide the news is false (at least two of the above-mentioned organizations have to say so), the story would be labeled “disputed by third-party fact-checkers” in the news feed. Links with this label would get smaller reach, and if users wanted to share them, a warning would pop up asking them to consider whether they really want to do so, as the news might be false.
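To make the described flow easier to follow, here is a minimal sketch in Python of how such a pipeline could be wired together. The two-organization threshold comes from the description above; everything else – the names, the 0.2 reach penalty, the trivial “software check” – is my own illustrative assumption, since Facebook has never published the actual logic.

```python
from dataclasses import dataclass, field
from typing import Optional, Set

# At least two partner fact-checking organizations must dispute a story
# before it is labeled (per the procedure described above).
DISPUTE_THRESHOLD = 2


@dataclass
class Story:
    url: str
    user_reports: int = 0
    flagged_as_suspicious: bool = False
    disputing_checkers: Set[str] = field(default_factory=set)

    @property
    def disputed(self) -> bool:
        return len(self.disputing_checkers) >= DISPUTE_THRESHOLD


def report(story: Story) -> None:
    """A user clicks 'Report' in the top right corner of the post."""
    story.user_reports += 1


def software_check(story: Story) -> None:
    """Stand-in for Facebook's automated check; the real criteria are not
    public. Here any reported story is simply treated as suspicious."""
    story.flagged_as_suspicious = story.user_reports > 0


def record_fact_check(story: Story, organization: str, verdict_false: bool) -> None:
    """A partner organization (e.g. Snopes, PolitiFact) files its verdict."""
    if verdict_false:
        story.disputing_checkers.add(organization)


def feed_weight(story: Story) -> float:
    """Disputed links get reduced reach; the 0.2 factor is purely illustrative."""
    return 0.2 if story.disputed else 1.0


def share_warning(story: Story) -> Optional[str]:
    """Warning shown before a user shares a disputed link."""
    if story.disputed:
        return "This story has been disputed by third-party fact-checkers. Share anyway?"
    return None


# Example flow through the pipeline
story = Story(url="http://example.com/fake-story")
report(story)
software_check(story)
if story.flagged_as_suspicious:
    record_fact_check(story, "Snopes", verdict_false=True)
    record_fact_check(story, "PolitiFact", verdict_false=True)
print(story.disputed, feed_weight(story), share_warning(story))
```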

It’s an interesting idea, but… I doubt it can work. Here are a few reasons.

There are too many motives for spreading fake news

Let’s take the American elections as an example again. Sometimes the motive is political – the desire to discredit opponents – and sometimes it is more prosaic: profit. There are too many motives to count. The world heard the story about the Balkan, more precisely Macedonian, teenagers who set up a bunch of “portals” with fake news – for example, that Hillary Clinton would be charged over the e-mail scandal. Or that in 2013 she had actually expressed support for Donald Trump, saying she would like to see a person like him as a presidential candidate because he is “honest and can’t be bought”.

The reason is clear – “pumping up” traffic and profit from AdSense and other advertising networks. A large part of that traffic came through Facebook, with the help of local groups and pages as well as private users who shared the news. Young Macedonians tried to do a similar thing with other presidential candidates, but with Trump they found fertile ground – as a controversial person himself, he was more “clickable” and “shareable”, bringing in more traffic and more profit.

Afterwards, Facebook promised to make it harder for such publishers to generate profit, but it remains to be seen whether this is an effective method.

Because…

Reporting false news is old news

Reporting false news is not a new option. But how many times have you used it so far? Was the reporting successful?

Reporting fake news on Facebook isn’t something new – this option has existed for at least two years. It was introduced to stop hoaxes – for example, warnings about white or black vans abducting children, or stories about drug dealers giving children strawberry-scented drugs (which the FBI declared a hoax back in 2008, though that didn’t stop social network users, and certain media, from sharing it and insisting that such an incident had happened the other day in their neighbourhood). In a way, this says something about human nature, but it also shows that Facebook has been fighting the problem for years.

Because…

How does reporting hate speech on Facebook work?

Now, let us consider the question in the heading – if you’ve ever tried to report objectionable content on this social network, you know that it takes a lot, really a lot, of effort for such a report to be accepted as legitimate. Even if minors are involved, even if it’s a matter of unauthorized use of content, or even cyberbullying, as in the unfortunate Croatian case of the Vinkovci w***** (link in Croatian). Only after the media, the police and, perhaps most important for Facebook, advertisers expressed their indignation were steps taken to remove such content – and even that was not efficient enough, because it’s just too easy to open a new Facebook page.

Who watches the watchmen?

Let’s return to the “human” factor that has re-entered the whole story of checking content authenticity. Can people who check facts do it quickly and accurately enough, even if they are experts at what they do? Can they take all factors into account in a short time? Just imagine what could be labeled false simply because someone didn’t like it. Am I boring you, do I talk and write too much? Label the link to this text as fake news! And who will do the checking in a small country such as my homeland, Croatia? Will Facebook pay local fact-checkers such as Faktograf? Will they be able to verify all information quickly and accurately before it does any damage – to the readers and viewers as well as to the publisher or author?

Because…

We live in the age of post-truth

I began my career in journalism in print media, and since entering the digital world I’ve been fascinated by the speed at which everything happens. At the same time, I started to wonder whether we actually get informed more slowly than ever. The question sounds like a paradox given the rapid exchange of information and the fact that only a couple of minutes after something happens anywhere in the world, the news is already on all the main news portals. I’ve advocated the theory that we are indeed exposed to a plethora of information, but that only after some time – after a lot of new information, retractions of the old, comments, summaries, analyses… – do we find out what the truth actually is. If we find it at all.

Only later did I learn that this idea has a name – post-truth, which Oxford Dictionaries declared the word of the year. It is defined as the “circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”.

Studies have shown that rumours travel faster than truth and that, on average, it takes at least 12 hours for a false claim to be declared false on the internet. And in 12 hours a lot can happen: our opinion, for example, is already formed, and we may no longer believe the objective facts – exactly what the definition of post-truth describes.

It doesn’t help that mass online media, which should check information before publishing it, are “quick on the trigger” in order to get traffic and clicks, and through them ads and profit… And this is how (you will not believe what happens next) we come to the last, somewhat more positive, point.

Hit them where it hurts the most – the pockets

All these phenomena have arrived too fast and in too massive a quantity for regulations, the sanctions for violating them, and people’s habits to adapt. In the media, it should be clear: the publisher (or editor or journalist, depending on the specific case) is responsible for the published content, but the problem lies in the content generated by users and readers – the comments. The European Court of Human Rights has confirmed that media outlets are responsible for that part of the content as well.

What about Facebook, “the largest medium without its own content”, as it is defined on the most frequently cited slide at every tech conference? It seems the answer is going to come from – Germany.

Strict Germany has reason to believe that Facebook will not be able to confront the flood of fake news by itself, so it decided to introduce a financial “incentive” – a fine of €500,000 for failing to remove questionable content within 24 hours. Yes, Facebook can defend itself by saying that it is not responsible for the content placed on the network, but from everything mentioned above it is evident that Facebook is becoming aware of its role in shaping public opinion.

What can we do?

Today it is very difficult to verify what is true and what is false on the internet, what is worth sharing with others and what is not. So, take a deep breath before forming your opinion. Remember the advice to wait for (at least) 12 hours?

We live in fascinating times. The importance of timely and valid information has perhaps never been greater.

We live in troubled times. But then again, we don’t see the forest for the trees, we are not aware of the bigger picture. Let’s not contribute to the mess by causing an uproar ourselves.
