The Delhi High Court on Wednesday expressed serious concern over the proliferation of obscene and vulgar content on mobile applications, observing that an entire generation cannot be allowed to be ruined by such material. The court was hearing a Public Interest Litigation (PIL) filed by Rubika Thapa seeking action against tech giants Google and Apple for hosting mobile applications on their respective platforms that offer pornographic and obscene content.
Court's Observations
During the proceedings, the bench comprising Chief Justice Satish Chandra Sharma and Justice Sanjeev Narula remarked that the availability of such content on mobile apps is a matter of grave concern, especially for the younger generation. The court noted that children and adolescents are particularly vulnerable to such material, which can have a detrimental impact on their mental and emotional development.
The PIL, filed through advocates, alleged that despite existing laws and guidelines, both Google and Apple continue to host applications that provide easy access to vulgar and pornographic content. The petitioner argued that this violates the rights of children and adolescents under the Constitution of India and various international conventions.
Legal Framework
The court referred to the Information Technology Act, 2000, and the rules framed under it, which prohibit the publication or transmission of obscene material in electronic form. The bench also mentioned the guidelines issued by the Ministry of Electronics and Information Technology (MeitY) for intermediaries, which require platforms to take down illegal content promptly.
However, the court observed that despite these provisions, enforcement remains weak and tech companies often fail to comply with the regulations. The bench directed the central government to file a detailed affidavit explaining the steps taken to regulate such content and to ensure that intermediaries adhere to the law.
Response from Google and Apple
Both Google and Apple, represented by their legal counsel, submitted that they have robust content moderation policies in place and that apps violating their guidelines are promptly removed from their stores. The court, however, expressed dissatisfaction with the current measures, stating that the problem persists on a large scale.
The bench questioned the effectiveness of the self-regulatory mechanisms adopted by these companies and emphasized the need for stricter oversight. The court also suggested that the government might consider creating a dedicated regulatory body to monitor and control the distribution of obscene content through mobile apps.
Next Steps
The court has listed the matter for further hearing on May 27, 2026, and has asked the central government to submit a compliance report. Additionally, the court directed the Department of Telecommunications (DoT) and MeitY to coordinate with internet service providers (ISPs) to block access to apps that are found to be hosting obscene content.
The bench also recommended that the government explore technical solutions, such as age verification mechanisms, to prevent minors from accessing such applications. The court concluded by reiterating that the protection of children and adolescents from harmful content is a paramount responsibility of the state and all stakeholders.