The internet bridges distances and has the potential to improve our lives in many significant ways. Recently, however, the prevalence of a dangerous phenomenon called ‘post-truth’ – in which appeals to emotion shape public opinion more than objective facts – and the rise of ‘fake news’, particularly on social media, have raised serious concerns. It is worth examining what might be causing them.
The first possible cause is the absence of editorial scrutiny on social networks. Many people use these networks as their primary news source, and the lack of editorial oversight is a major factor fuelling fake news. This is not an easy problem to solve. For one, how many editors would be required to police a product used by billions? What if a fake story has already reached thousands of users (including well-connected ones) before editorial scrutiny begins? How would editors undo the damage done? Regardless of these difficulties, editorial scrutiny and oversight are sorely needed.
Another factor enabling post-truth is the low barrier to entry for starting a news business on social media. All you need is a Facebook page for content and an ad-supported website to generate revenue. For example, during the recent US election season, some people running politically-themed Facebook pages were essentially marketers operating from their basements, with little capital investment and no formal training in ethical journalism. These marketers post clever ‘linkbait’ headlines to entice users to click. The absence of ethical standards, combined with the profit motive, makes it possible to manufacture controversy.
The aggressive, cost-effective and precise collection and manipulation of user data by marketers on social networks further contributes to the rise of post-truth. Powerful computational algorithms make it possible to infer and predict a wide range of information about individuals from this collected data. A 2013 study by Michal Kosinski, an assistant professor of organisational behaviour at Stanford Graduate School of Business, revealed that Facebook “likes” alone can be used to accurately predict personal attributes such as “ethnicity, religious views, political views, personality and mood, age and even gender”. Once these predictions are in place, it is easy to target the profiled users with tailored, potentially fake, news.
It is also worth noting that polarisation can drive people to voice extreme opinions online while ignoring facts altogether. Cass Sunstein of Harvard University has argued that the internet’s high degree of anonymity makes acute polarisation far more likely. We see this on social networks every day – intensely polarised groups read and share news stories that appeal to emotion rather than to logic and fact-checking. This polarisation is further aggravated by algorithmic biases. For example, Facebook’s news feed algorithm shows us only what we want to see, exposing us to a narrow spectrum of opinion (tailored to our liking) and thereby exacerbating polarisation.
The effects of social influence and information cascades further propel post-truth. Social influence theory suggests that we take cues for our own behaviour by imitating what others do. For example, research suggests that informing people about the high levels of voluntary tax compliance among others increases their own compliance. Information cascades occur when we rely on information spread by others while ignoring our own knowledge. People often re-share a news story without fact-checking it – setting aside their own feelings and knowledge about the item – when they see many (seemingly credible) people around them sharing it.
So what do we do? First, social networks such as Facebook should not deviate from their original mission: to connect people, not to spread false information (and even hatred) among them.
Social networks should also raise the barrier to entry for new news businesses by adding more checks and balances.
Furthermore, these networks must tweak their opaque algorithms so that they don’t just keep feeding one type of news to readers. Sunstein suggests that exposure to opposing views is important for groups to depolarise.
Social networks also need at least some human editors to check the validity of both page content and its sources. In Pakistan, we see many Facebook pages with names like “Hate [insert politician name here]”. These pages pack in more ad-hominem attacks per paragraph than perhaps any other communication medium. They serve no useful purpose and should be taken down.
We should understand how much information we, as users, are feeding social networks and how much power this gives them to predict our personality traits, future views and actions.
Lastly, we should realise that the internet is a medium where polarisation, social influence and information cascades are at play. We should always fact-check news (since others might rely on us) and try to expose ourselves to a wide range of opposing views, knowing that some social media companies are suppressing that exposure.
Email: wyounas@lumsalumni.pk