Need to Regulate Abusive Online Content: Supreme Court

  • The Supreme Court has highlighted the urgent need to regulate abusive and harmful online content.
  • The Court emphasized creating an effective mechanism to ensure accountability for user-generated online content.

Concerns Raised by the Supreme Court

  • The Court noted that harmful posts spread with extreme virality, reaching large audiences within seconds.
  • It observed that this velocity renders takedown mechanisms largely ineffective, since the damage is done before content can be removed.
  • The Court observed that mere one-line adult-content warnings do not prevent minors from accessing explicit material.
  • The Court highlighted that user-generated channels operate without regulatory oversight.
  • The Court noted that such unregulated channels allow unverified or provocative content to spread unchecked.
  • The Court cautioned that misinformation becomes dangerous when it distorts facts or is used to incite hatred.
  • It warned that, while dissent itself is democratic, such misinformation can trigger social unrest.

Proposals and Directions by the Supreme Court

  • The Court directed the Centre to draft new guidelines for all forms of digital content.
    • These guidelines will cover User Generated Content (UGC) on social media.
    • These guidelines will also cover OTT platforms, news content, and curated online content.
    • The Court directed that these guidelines must be framed after public consultation.
  • The Court suggested establishing an autonomous regulator for digital content.
    • The Court recommended that this regulator be neutral and independent.
    • The Court stated that it should either replace or supplement existing self-regulatory mechanisms.
  • The Court directed the formation of an expert committee to study online content regulation.
    • The Court stated that this committee must include domain experts.
    • The Court also stated that this committee must include persons with judicial background.
  • The Court proposed an age-verification system for access to adult content.
    • It suggested Aadhaar-based or PAN-based verification as possible methods.
    • It stated that age verification must go beyond mere disclaimers.

Existing Legal Mechanisms

  • The Information Technology Act, 2000 regulates online intermediaries and digital content.
    • The IT Rules 2021 require intermediaries to appoint grievance redressal officers.
    • The IT Rules also mandate time-bound removal of unlawful online content.
    • The IT Rules further require due diligence from digital intermediaries.
  • In October 2025, the government proposed rules to ensure AI-generated content is clearly labelled.
    • These rules require intermediaries to verify AI-generated content before upload.
  • The Bharatiya Nyaya Sanhita (BNS), 2023 criminalises defamation and obscenity.
    • BNS also criminalises sedition-like acts.
    • BNS further criminalises incitement to violence.
  • Digital platforms follow self-regulatory codes for content moderation.
  • OTT platforms maintain internal guidelines for content classification and moderation.
  • Broadcasters use rating systems and moderation procedures to regulate content.
  • Victims can seek judicial remedies after harm occurs.
    • Judicial remedies include damages for harm suffered.
    • Judicial remedies also include injunctions to prevent further harm.
    • Judicial remedies further include criminal proceedings against offenders.
