India's Approach to Regulating Social Media

India does not support an unfettered right to freedom of speech. The Constitution permits reasonable restrictions on the exercise of this right in the interests of sovereignty, security, public order, decency, or morality. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (the Intermediary Guidelines) permit a social media intermediary (including commonly used social media platforms like Facebook, X, Instagram, and TikTok) to remove information that violates its rules and regulations. This includes information that:

  • is obscene, pornographic, paedophilic, invasive of another's privacy, including bodily privacy, insulting or harassing based on gender, racially or ethnically objectionable, or promoting enmity between different groups on the grounds of religion or caste with the intent to incite violence;
  • is harmful to children;
  • infringes any patent, trademark, copyright, or other proprietary rights;
  • deceives or misleads the addressee about the origin of the message or knowingly and intentionally communicates any misinformation or information that is patently false and untrue or misleading in nature, or, in respect of any business of the Central Government, is identified as fake or false or misleading by the Central Government;
  • threatens the unity, integrity, defence, security, or sovereignty of India, its friendly relations with foreign states, or public order, or incites the commission of any cognisable offence, or prevents investigation of any offence, or is insulting towards other nations; and
  • violates any law for the time being in force.

The Intermediary Guidelines also provide a mechanism for grievance redressal, but they do not specify the detailed process for addressing complaints or the extent to which such decisions must be justified.

The problem is not the guidelines themselves but their enforcement and the transparency with which they are applied. Unsurprisingly, the Government can also direct a social media intermediary to remove information on any of the grounds stated above. This extends to content removal on the grounds of decency and morality, which is consistent with the actions of other governments around the world. The assault on Rushdie's stated cause is complete. As an aside, India was one of the first countries to ban The Satanic Verses, well before anyone in India had even read it!

The EU Alternative

The EU has recently enacted two important pieces of legislation in an attempt to control the unruly Internet – the Digital Services Act (DSA) and the Digital Markets Act (DMA) – which allow Internet and social media intermediaries to regulate Internet content. The DSA and DMA also allow the European Commission to conduct market investigations and provide defined remedies if firms (i.e. the entities specified by the DSA and DMA) fall out of line.

DMA
The DMA, which came into force on 1 November 2022, is one of the first tools to comprehensively regulate the 'gatekeeper' power of the largest digital companies. The DMA defines a 'gatekeeper' as a large digital company providing core platform services such as online search engines, app stores, and messaging services, and mandates compliance with the obligations and prohibitions listed in the DMA.

DSA
The DSA,1 which became fully applicable on 17 February 2024, regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms. Its main goal is to prevent illegal and harmful online activities and the spread of disinformation, with the intent to ensure user safety, protect fundamental rights, and maintain a fair and open online platform environment.

From our vantage point, the most exciting and noteworthy aspect of this legislation is the protection of users' fundamental rights online, including freedom of speech. It also strengthens the public oversight mechanism for online platforms, in particular for Very Large Online Platforms (VLOPs), those that reach more than 10% of the EU's population.

Under the DSA, providers of intermediary services, including online platforms, must tell their users why their content has been removed or why access to their account has been restricted. Providers of hosting services, including online platforms, now have an express legal obligation to provide clear and specific statements of reasons for their content moderation decisions. The DSA also empowers users to challenge such decisions through an out-of-court dispute settlement mechanism.

According to the EU, even online platforms like Facebook and Instagram, which had already been sharing their content moderation decisions, now offer a greater range of content moderation information with the advent of the DSA.

The DSA is not limited to the obligation to provide reasons. To create precedents and ensure consistency of behaviour, the European Commission has launched the DSA Transparency Database, a first-of-its-kind database that makes publicly accessible all statements of reasons submitted by providers of online platforms for their content moderation decisions. A public database is important because it gives both individuals who wish to post content on these platforms and the moderators of those platforms an objective yardstick by which to make and justify their decisions.

1. https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act_en

Lessons for India

We need to amend the Intermediary Guidelines in India and be faithful to the constitutional objective of imposing only reasonable restrictions on the freedom of speech, imagination, and expression. The reasons for any content moderation decisions made by social media intermediaries or the Government must be set out clearly in writing and made publicly accessible. The removal of content on the grounds of indecency or immorality needs to be justified. Only then will the grievance and appellate mechanisms envisaged by the Intermediary Guidelines achieve their full potential.

While social media platforms are powerful tools for the dissemination of information and ideas, they need to be regulated, not just to protect freedom of speech but also to prevent trolls and other unsavoury characters from using the platforms to express bigoted and ignorant thoughts. We must reclaim social media as a channel for the expression of irreverent and satirical ideas, without fear of recrimination or trolling.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.