The Delhi High Court on Wednesday emphasized that social media intermediaries must conduct due diligence even at the time applications are uploaded, as it directed Google and Apple to remove from their platforms mobile applications that disseminate obscene and pornographic content.
Court's Observations
“We can’t permit a whole generation of the country to be ruined. We understand all kinds of freedom under Article 19 but that does not mean we allow dissemination of vulgar content,” the court observed. It reminded intermediaries that their role is not limited to blocking content only after complaints are received.
A bench comprising Chief Justice D K Upadhyaya and Justice Tejas Karia stressed that social media intermediaries must play the “most vital role” by taking action against such apps even at the time of their upload.
Directions to Authorities
The court also instructed the Indian Computer Emergency Response Team (CERT-In), under the central government, to monitor and check the dissemination of such content. These directions were issued while hearing a Public Interest Litigation (PIL) filed by Rubika Thapa against the hosting of mobile applications offering vulgar and pornographic content on platforms run by Google and Apple.
Next Hearing
The matter has been scheduled for the next hearing on July 17.
About the Author: Abhinav Garg, legal editor for Delhi, covers the courts and the connected legal challenges shaping the capital. With over two decades of experience, he breaks down complex legal jargon and explains how verdicts and developments in the courts affect readers.