Who would have thought that social media algorithms designed to serve you content you like could be a bad thing?
We live in an age where personalisation is key, and content marketing platforms are working furiously to tailor our experiences to our wants and needs. The consumer has never had so much control over what is shown to them. A single like on one piece of content signals the platform's recommendation algorithm to surface more content on similar topics.
Making the whole experience as pleasant and addictive as possible – surely everyone is winning from this?
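The feedback loop described above can be sketched in a few lines. This is a toy illustration, not any real platform's algorithm – the topic names, the scoring scheme, and the boost value are all assumptions made for the example:

```python
# Toy sketch of a like-driven feed ranker. The scoring scheme and
# topic labels are illustrative assumptions, not a real platform's logic.
from collections import Counter

def rank_feed(posts, like_history, boost=2.0):
    """Score posts higher when their topic matches topics you've liked."""
    liked_topics = Counter(like_history)
    # Each post scores 1, plus a boost for every past like on its topic.
    return sorted(posts,
                  key=lambda p: 1 + boost * liked_topics[p["topic"]],
                  reverse=True)

posts = [{"topic": "politics_left"},
         {"topic": "politics_right"},
         {"topic": "sport"}]
likes = ["politics_left", "politics_left", "sport"]

feed = rank_feed(posts, likes)
# Topics you have already liked float to the top; the rest sink.
```

Every like feeds back into the next ranking, so the gap between what you see and what you never see only widens over time.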
Empathy is cultivated through exposure to different viewpoints
If your favourite social media platforms only show you stories from people you tend to agree with, you can become very isolated.
You no longer need to defend your viewpoint to those who oppose it – which means two very scary things:
You don't need to examine why you think the way you do.
You can start to dehumanise opposing views, as you will never have to hear valid points from someone you disagree with.
This algorithmic bias shows itself most fiercely when important issues come to a head, such as the recent American elections or, in our own country, the political turmoil.
During the American elections, Trump supporters effectively liked any liberal views out of their timelines – seeing only content that suited them – and vice versa for the liberals.
Fake news
This would be bad enough as a starting point for isolating groups from each other, but then come the people who understand the very formulas serving you content, and who have the power to control and mould it.
Fake news is such an uncontrollable force because it is spread with these algorithms in mind. Its spreaders also take advantage of the fact that people are more likely to share sensational news online than more measured content.
For many, social media is the place to show their opinions and take a stand for something. Fake news plays perfectly into this need – no matter which side of the fence you are on.
Echo chambers
A recent study into social media behaviour found that, much like in the real world, online we tend to interact in groups of similar interests and opinions.
That makes sense, but it also means that anything we share is more likely to be championed than critically appraised. And that quick agreement means misinformation can spread through these echo chambers like wildfire.
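The wildfire dynamic comes down to a simple branching effect: if each sharer's friends agree with the post, more than one of them reshares it, and the cascade grows each round. The numbers below – friends per user and share rates – are assumptions chosen for illustration, not figures from the study:

```python
# Expected-value model of a share cascade. Friend counts and share
# rates are illustrative assumptions, not data from the cited study.
def cascade_size(friends_per_user, share_rate, rounds=5):
    """Expected total shares when each sharer's friends reshare
    at share_rate (deterministic branching model)."""
    sharers, total = 1.0, 1.0
    for _ in range(rounds):
        sharers = sharers * friends_per_user * share_rate
        total += sharers
    return round(total)

# Inside an echo chamber nearly everyone agrees, so the share rate is high:
# each sharer spawns 3 new sharers (10 * 0.3) and the cascade explodes.
inside = cascade_size(friends_per_user=10, share_rate=0.3)

# Among friends who mostly disagree, each sharer spawns only 0.5 new
# sharers (10 * 0.05) and the cascade quickly fizzles out.
outside = cascade_size(friends_per_user=10, share_rate=0.05)
```

The tipping point is whether each sharer produces more or fewer than one new sharer – echo chambers push that number well above one.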
Put your thinking cap back on
If this is all making you feel a little powerless, you are far from it.
As much as a platform such as Facebook is controlled by its algorithm, you are still in the driver's seat. Every click you make, every piece of content you interact with – all these actions dictate what is shown to you.
We also need to start dialling back the culture of everything being instant – from gratification to reactions. All news from unknown sources should be verified before sharing. In an age where Twitter is most people's main news source, we are all our own investigators. The more we all look into reports for ourselves, the better off we will all be.
Not only that, but Facebook in particular has recognised the massive flaw in its current algorithm. It is now starting to institute various measures to ensure that fake news can no longer spread unchecked. From now on, Facebook will put disclaimers on articles containing disputed content, alerting users to possible false reports.
Are you ready to open up the communication age again?