Compliance through bans

March 2, 2025

Social media bans have done little to ensure meaningful accountability for the platforms

A year on from X (formerly Twitter) being banned in Pakistan, we are no closer to clarity on why the platform continues to be blocked in the country. From the Pakistan Telecommunication Authority to the Ministry of Interior, the authorities have provided no rationale apart from vague justifications of national security. These non-explanations are indicative of the wide and discretionary powers given to the authorities to block and remove content under the law, as well as the failure of our courts to impose any guardrails on these powers.

Pakistan’s practice of banning entire social media platforms is not new. Authorities have blocked TikTok four times during its short lifespan; YouTube was inaccessible to users in Pakistan for over three years; even Wikipedia was blocked briefly in 2023. The common denominator in all these moves is that our legal system has largely accepted wholesale bans as a legitimate tool for content moderation. Specific pieces of content shared on platforms are cited as reason enough to block the entire platform. Anyone who has ever used a social media platform like X knows that its use is incredibly varied, to the point that many of us refer to which ‘part’ of the internet we are on. Our timelines, fuelled by algorithms designed to curate and personalise content, are not a monolith; they show the myriad ways in which these platforms can be used. Though the analogy has been used ad nauseam in free speech jurisprudence, the lack of proportionality is aptly captured by the image of shutting down an entire highway because accidents occur on it.

This mode of regulation, outright and wholesale bans on entire platforms, is also indicative of the lack of nuance in discussions regarding social media. Social media platforms and the companies that own them have amassed immense power and are largely unaccountable. The platforms are owned by a handful of companies: WhatsApp, Facebook and Instagram are owned by Meta; Google and YouTube are owned by Alphabet; X is owned by Tesla billionaire Elon Musk; and TikTok is owned by ByteDance. This concentration of power, and the largely unaccountable ways in which these platforms operate, are serious problems. There is now ample evidence of the deleterious impact of these platforms, left to run without any transparency or accountability: the monetisation of private data, biases in content moderation and the amplification of harmful content. With the popularisation of generative AI through the near-ubiquitous use of ChatGPT, these issues will only grow.

Countries like Pakistan, however, have sought to take a sledgehammer to the internet, employing social media bans and stringent criminal laws that have inevitably done little to ensure meaningful accountability for these platforms. Earlier this year, the back and forth over TikTok in the US, where a proposed ban was justified on privacy and national security grounds because of the app’s Chinese origin, proved to be little more than a bullying tactic to ensure compliance from the platform. The intention of the ban was never structural change to secure the personal data of US users. Had that been the case, the US would have applied the ban to all platforms, or passed comprehensive legislation protecting digital data and information and dismantling the profit models of tech companies. Instead, it became obvious when US President Donald Trump lifted the ban that the intention was not to change the ways in which tech companies operate or to make them more accountable to users; it was to ensure compliance and amenability to US interests. Ultimately, regulation leaning towards censorship does not change the fundamental ways in which tech platforms operate. Rather, it merely reshapes these broken systems to bend towards a newer locus of control.

The Pakistan government hopes to achieve similar results when it bans platforms. The ban on YouTube was lifted in 2016 because the government was able to negotiate a localised version of the platform that allowed for geo-specific content removal requests. After multiple TikTok bans, the company now works closely with the government to assuage its concerns. In July last year, the Minister of Information and Broadcasting told the Senate that X had been banned for failing to comply with the government’s content removal requests. If X bends the knee and complies, it is likely that the ban will be lifted. Recent amendments to the Prevention of Electronic Crimes Act contain provisions requiring social media platforms to enlist with the yet-to-be-created Social Media Protection and Regulatory Authority. The intention is clear: the government does not have a problem with social media platforms, only with the ones that fail to work with it.

It is unlikely that Pakistan will ever have a real reckoning with social media platforms and their unchecked powers. We have been going in circles for the last 15 years: new laws are introduced to dress up old mechanics of control. Global conversations about accountability for social media are fragmented and often flawed. Pakistan would do well to listen to digital rights activists, who have been saying for years that regulation should tilt towards neither governments nor privately owned companies. It must prioritise the users who make these platforms vibrant and diverse spaces in the first place.


The writer is a researcher and campaigner on human and digital rights issues
