
The regulation of online content in India has once again come into focus with the recent advisory issued by the Ministry of Electronics and Information Technology (MeitY) directing social media platforms to proactively take down obscene and pornographic content. The advisory reiterates the statutory obligations of intermediaries under Indian law and signals a stricter approach towards digital content moderation, particularly by large social media platforms and OTT services.
Background of the Advisory
The advisory has been issued in the backdrop of increasing concerns over the availability and circulation of obscene, sexually explicit, and harmful content on digital platforms. According to the government, several platforms were found to be inadequately screening such content, despite existing legal obligations. The issue has also drawn judicial attention, with the Supreme Court of India urging the Union Government to take effective steps to curb online obscenity, especially content that may be harmful to children and public morality.
Legal Framework: IT Rules, 2021
The advisory derives its authority from the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, framed under the Information Technology Act, 2000. These Rules impose due diligence obligations on intermediaries, including social media platforms, to ensure that prohibited content is not hosted or circulated through their services.
Under the Rules, intermediaries are required to make “reasonable efforts” to ensure that users do not upload, publish, transmit, store, or share content that is obscene, pornographic, sexually explicit, paedophilic, or otherwise unlawful. Importantly, the Rules mandate that significant social media intermediaries—defined as those having more than fifty lakh registered users in India—must deploy technological tools to proactively identify and remove such objectionable content.
Obligations of Social Media Platforms
The advisory emphasizes that compliance with the IT Rules is not optional. Platforms are expected to adopt automated detection mechanisms, improve content moderation practices, and respond swiftly to unlawful content. Failure to comply may attract legal consequences, including the loss of the intermediary safe harbour protection under Section 79 of the IT Act. Once that protection is withdrawn, a platform may be held liable for third-party content hosted on its services.
The advisory also clarifies that intermediaries must not permit the hosting or dissemination of content that is prohibited “under any law for the time being in force,” thereby linking digital compliance directly with existing criminal and regulatory laws governing obscenity.
Judicial Context and Policy Direction
The timing of the advisory is significant. It comes shortly after judicial observations highlighting the absence of effective regulatory mechanisms for certain digital platforms, particularly those specialising in erotic or sexually explicit content. While the Supreme Court has not laid down a new definition of obscenity, its concern has prompted the executive to reinforce the existing legal framework.
This development reflects a broader policy direction where the government is increasingly relying on delegated legislation and advisories to regulate the digital ecosystem, rather than introducing fresh primary legislation.
Implications for Digital Platforms and Users
For social media platforms and OTT services, the advisory signals enhanced scrutiny and lower tolerance for non-compliance. It necessitates investment in stronger content moderation infrastructure and clearer enforcement of community guidelines. For users, the advisory may result in stricter content removals and reduced circulation of explicit material, though it also raises concerns about over-filtering and the breadth of discretionary moderation exercised by platforms.
Conclusion
The MeitY advisory serves as a reminder that India’s digital space is governed by an evolving regulatory framework that seeks to balance technological growth with legal accountability. By reiterating the obligations under the IT Rules, 2021, the government has reinforced the expectation that intermediaries act responsibly in preventing the spread of obscene and unlawful content. How effectively these directions are implemented—and how they impact free expression online—will be closely watched in the coming months.
