I’m sure you’ll be aware of Facebook, and probably even aware of its news features. Like all systems it isn’t perfect; however, its flaws are very public and are seen as pretty big problems.
Well, one of the problems with Facebook’s news feed lies in how it ranks stories. Facebook does have algorithms in place to prioritise and order stories, and even tailors them to each user to some extent: it surfaces posts shared or created by pages you or your friends follow, and draws on data such as your browsing history and other information you have fed into the site.
But Facebook prioritises these stories based on popularity, without considering any measure of truth or credibility.
So surely the popular posts that are shared and viewed the most must have some legitimacy to them? Not necessarily. The statistics vary depending on where you look, but multiple sources report that around 60% of shared links are never even opened or read by the users sharing them. With that in mind, there is a hole in this system: it rewards posts with catchy or misleading titles.
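To make that hole concrete, here is a minimal, hypothetical sketch of popularity-only ranking. Facebook’s real algorithm is not public, so the signals and weights below are invented purely for illustration; the point is that nothing in the score measures truth.

```python
# Hypothetical popularity-only ranking: note that nothing here measures
# credibility, only engagement.
def popularity_score(post):
    """Score a post purely on engagement signals (illustrative weights)."""
    return post["shares"] * 3 + post["likes"] * 2 + post["clicks"]

posts = [
    {"title": "Shocking claim!", "shares": 900, "likes": 500, "clicks": 100},
    {"title": "Careful reporting", "shares": 50, "likes": 200, "clicks": 400},
]

# The catchy post wins despite far fewer clicks, because the score
# rewards sharing even when the link was never actually opened.
ranked = sorted(posts, key=popularity_score, reverse=True)
print(ranked[0]["title"])  # the "Shocking claim!" post ranks first
```

Under a scheme like this, a misleading headline that gets reflexively shared will always outrank a sober article people actually read.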
Well, there are multiple things Facebook can look into doing: it can try to automate some sort of filter for potentially false information and hoaxes, or it can allow users to help filter out some of this content.
One approach Facebook has already tried, to much criticism, is using human moderators to curb this trend. However, the moderators were accused of bias, allegedly favouring more liberal stories in the run-up to the recent US election.
The automated approach has already been achieved by fellow tech giant Google, which has managed to “fact check” stories, linking sites that fail the check to trusted sources.
The final approach is to allow users to help out. While this reopens some of the earlier problems, bias and the poor judgement behind blind sharing among them, it does give users control and shifts the blame to some degree.
This is reportedly a solution Facebook is currently exploring: it has started showing what is essentially a small survey asking how misleading an article and its language were, allowing users, as a community, to flag “fake” news so it can be filtered out.
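Facebook hasn’t published how these survey responses feed back into the feed, but the idea of community flagging can be sketched as simple vote aggregation with a threshold, so that one malicious flag can’t bury an article. The function names, the threshold, and the data shapes below are all assumptions for illustration.

```python
from collections import Counter

# Assumed value; a real system would tune this and likely weight
# flags by user trustworthiness rather than counting them equally.
FLAG_THRESHOLD = 5

def filter_flagged(articles, flags):
    """Keep only articles flagged by fewer than FLAG_THRESHOLD users.

    articles: list of article ids
    flags: list of (user_id, article_id) reports from the survey
    """
    # De-duplicate so each user's flag for an article counts once.
    counts = Counter(article for _, article in set(flags))
    return [a for a in articles if counts[a] < FLAG_THRESHOLD]

flags = [(f"user{i}", "hoax-story") for i in range(5)]
flags.append(("user0", "real-story"))
print(filter_flagged(["hoax-story", "real-story"], flags))
# the heavily flagged article is dropped; the other survives
```

The threshold is the interesting design choice: too low and a handful of users can censor anything they dislike, too high and hoaxes linger, which is exactly the bias trade-off described above.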
While Facebook is not a news outlet and doesn’t itself produce these news stories, should it take control and deal with this? In my opinion, despite that perhaps not being what it wants to do, with the growing number of people looking to Facebook for news and trusting the information they see on the site, it should take control and do something to alleviate the problem.
However, I feel it should follow in Google’s footsteps and work to automate the process.