
TMT Lawyers India 2026: IT Rules, SGI Labelling, 3‑hour Takedowns & Intermediary Liability

By Global Law Experts

India’s Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 represent the most consequential overhaul of platform regulation since the original 2021 intermediary guidelines. For TMT lawyers across India, and for the in-house counsel, compliance leads, product managers and founders they advise, the new rules create three urgent operational demands: mandatory labelling of synthetic and AI-generated information (SGI), a dramatically compressed 3-hour takedown window for unlawful content, and expanded due diligence and record-keeping duties that reshape intermediary liability. This guide delivers the step-by-step compliance playbook, sample clauses, runbooks and checklists that platforms need to implement the IT Rules 2026 defensibly and at speed.

At-a-Glance Summary and Key Action Checklist

The IT Amendment Rules 2026, notified by MeitY, require every significant social media intermediary (SSMI) and regulated intermediary to label all SGI with persistent, tamper-proof metadata; remove or disable access to notified unlawful SGI within three hours; and satisfy strengthened due diligence, grievance redressal and audit obligations. Failure to comply risks loss of the safe-harbour protection under Section 79 of the Information Technology Act, 2000.

Three immediate compliance priorities:

  • Label all SGI. Deploy user-facing labels and embed machine-readable metadata at the point of publication. Ensure anti-tampering safeguards prevent removal or modification of the label, metadata or unique identifier.
  • Stand up a 24/7 rapid-response takedown operation. Build or retool content-moderation workflows to meet the 3-hour removal SLA for notified unlawful content, down from 36 hours under the previous rules.
  • Update contracts, policies and audit trails. Revise user terms and conditions, AI-vendor SLAs, and internal record-keeping procedures to reflect expanded due diligence and traceability requirements.

What the IT Amendment Rules 2026 Change: Legal Provenance

The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 amend the parent 2021 rules issued under the Information Technology Act, 2000. The amendment was notified by MeitY and introduces a new regulatory layer specifically targeting AI-generated content and deepfakes, while simultaneously tightening general intermediary compliance timelines across the board.

Scope: Who Is Covered?

The rules apply to all intermediaries operating in India, with heightened obligations for SSMIs (platforms that meet the user-base thresholds designated by MeitY). Digital news media entities and OTT platforms already regulated under Part III of the 2021 rules also face updated content-moderation obligations where SGI is concerned. The practical effect is that any platform permitting user-generated content (social media, messaging, video-sharing, e-commerce reviews, or generative-AI tools serving Indian users) must evaluate its position under the amended framework.

Key Regulatory Milestones

| Milestone | Date / Status | Significance |
| --- | --- | --- |
| IT (Intermediary Guidelines) Rules, 2021 (original notification) | 25 February 2021 | Established baseline intermediary due diligence, grievance-officer and SSMI obligations |
| Successive amendments | 2022–2023 | Introduced fact-check unit references and additional compliance timelines |
| IT (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 (notified by MeitY) | February 2026 | Introduced SGI labelling, 3-hour takedown, expanded due diligence and anti-tampering obligations |

The IT Amendment Rules 2026 do not replace the 2021 framework; they layer additional duties on top of it. Platforms that were already compliant with the 2021 rules must now address the incremental SGI and accelerated-takedown requirements.

SGI Labelling Obligations: Definition, Metadata and Anti-Tampering

At the heart of the IT Rules 2026 is the concept of Synthetic or Generated Information (SGI). Under the rules, SGI encompasses any content (text, image, audio, video or any combination) that is substantially generated, modified or manipulated using software, algorithms, machine learning, large language models or other computational techniques, such that it could reasonably be mistaken for organic, human-created content.

What TMT Lawyers in India Need to Know About Labelling Requirements

SSMIs must require users uploading content to declare whether that content is SGI. Platforms must then deploy technical measures to verify the accuracy of such declarations and, upon confirmation that the content qualifies as SGI, apply a clear and prominent label indicating its synthetic origin. The labelling obligation operates at three levels:

  • Visual/user-facing label. A conspicuous on-screen indicator (badge, watermark overlay, or text label) visible to the end viewer at the point of consumption, in feeds, search results, video players and audio players alike.
  • Machine-readable metadata. Structured metadata embedded in the content file or platform record, including at minimum the generating tool or model identifier, the generation or modification date, and a unique content identifier that persists across shares and re-uploads.
  • Unique identifier / provenance trail. A persistent identifier enabling downstream traceability, so that even if content migrates between platforms, its synthetic origin can be verified.
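The machine-readable layer can be sketched as a minimal metadata record. The attribute names below (`tool_id`, `generated_on`, `content_id`) are illustrative assumptions, not terms prescribed by the rules; only the underlying fields (tool identifier, date, unique identifier) come from the rules as described above.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class SGIMetadata:
    # Minimum fields described above; these attribute names are assumptions.
    tool_id: str        # generating tool or model identifier
    generated_on: str   # generation or modification date (ISO 8601)
    content_id: str     # unique identifier persisting across shares and re-uploads

def make_metadata(tool_id: str, generated_on: str, content: bytes) -> SGIMetadata:
    # Derive the persistent identifier from the content itself so it can be
    # re-verified after the content migrates between platforms.
    return SGIMetadata(tool_id, generated_on, hashlib.sha256(content).hexdigest()[:16])

meta = make_metadata("example-model-v1", "2026-03-01", b"synthetic clip bytes")
record = json.dumps(asdict(meta), sort_keys=True)  # machine-readable platform record
```

Deriving the identifier from a content hash is one design choice; provenance standards such as C2PA use signed manifests instead.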

Sample SGI Label Text

Platforms have discretion over exact wording, but the label must be unambiguous. Industry observers expect most platforms to adopt tiered approaches:

  • Short (in-feed badge): “AI-Generated” or “Synthetic Content”
  • Medium (content detail page): “This content was generated or substantially modified using AI tools. See metadata for details.”
  • Full (metadata/info panel): “Synthetic / Generated Information (SGI). Generated by [Tool/Model]; Date: [YYYY-MM-DD]; Unique ID: [identifier]. This content is subject to India’s IT Amendment Rules 2026.”

Anti-Tampering and Label Integrity

Under the anti-tampering safeguard in the rules, the SGI label, associated metadata and unique identifier must not be modified, suppressed or removed, whether by the uploading user, downstream sharers, or the platform itself. Platforms must implement technical controls (cryptographic hashing, watermarking, or content-authenticity infrastructure) to detect and prevent tampering. Where a platform detects that metadata has been stripped or altered, it is required to flag the content for review and, where warranted, restrict its distribution pending re-verification. The practical burden here is significant: engineering teams must integrate anti-tampering checks into content ingestion pipelines, CDN delivery layers and API outputs.
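One common pattern for detecting tampering is to sign the metadata at ingestion so that any later modification or stripping invalidates the signature. The sketch below is a minimal illustration under assumed names (the key, field names and function names are hypothetical); a production deployment would keep the key in a KMS/HSM and would likely layer in a content-authenticity standard such as C2PA.

```python
import hashlib
import hmac
import json

# Hypothetical platform signing key; in production this would live in a KMS/HSM.
SIGNING_KEY = b"platform-secret-key"

def sign_metadata(metadata: dict) -> str:
    # Canonicalise before signing so any field change or removal
    # produces a different signature.
    canonical = json.dumps(metadata, sort_keys=True).encode("utf-8")
    return hmac.new(SIGNING_KEY, canonical, hashlib.sha256).hexdigest()

def is_tampered(metadata: dict, signature: str) -> bool:
    # Constant-time comparison avoids leaking signature bytes via timing.
    return not hmac.compare_digest(sign_metadata(metadata), signature)

meta = {"tool_id": "example-model-v1", "content_id": "abc123"}
sig = sign_metadata(meta)
stripped = {"tool_id": "example-model-v1"}  # unique identifier removed downstream
```

Here `is_tampered(meta, sig)` is false for the original record, while the stripped copy fails verification and would be flagged for review.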

3-Hour Takedown Rule: Scope, Triggers, Exceptions and Operational Runbook

The most operationally disruptive element of the IT Rules 2026 is the reduction of the compliance window for content removal from 36 hours to just 3 hours for certain categories of notified content. This 3-hour takedown rule applies when a platform receives a valid notification (from a government authority, a court order, or an authorised complainant) identifying content that is unlawful, harmful or in violation of the rules’ SGI provisions.

What Triggers the 3-Hour Clock?

  • Government or court-ordered takedowns: Directions issued under Section 69A of the IT Act or by a competent court.
  • Non-consensual intimate imagery and deepfakes: Content depicting individuals in fabricated intimate or degrading scenarios without consent, identified as a priority enforcement category.
  • Impersonation via SGI: Synthetic content that impersonates a real, identifiable person without authorisation.
  • Content flagged by the Grievance Appellate Committee (GAC): Escalated complaints routed through the appellate process where the GAC directs removal.

For particularly sensitive content, such as non-consensual deepfake nudity or impersonation, early indications suggest that the practical compliance expectation may be even shorter, with some analyses noting timelines as tight as two hours for the most egregious material.

3-Hour Takedown Runbook

| Action | Responsible Team | SLA (Time from T₀) |
| --- | --- | --- |
| Receive and log valid notice (T₀) | Trust & Safety / Legal Ops intake queue | 0 minutes |
| Triage: classify content type and urgency tier | Trust & Safety lead on duty | ≤ 15 minutes |
| Verify notice validity (authority, jurisdiction, specificity) | Legal counsel (on-call) | ≤ 30 minutes |
| Locate content across platform surfaces (original + re-shares) | Engineering / Content-ID systems | ≤ 45 minutes |
| Execute takedown (disable access, geo-block, or remove) | Engineering / Moderation ops | ≤ 90 minutes |
| Log takedown action: timestamp, content hash, notice reference, decision rationale | Compliance / Legal Ops | ≤ 120 minutes |
| Notify affected user(s) of takedown and appeal rights | User operations / Automated system | ≤ 150 minutes |
| Final compliance confirmation and evidence preservation | Compliance officer | ≤ 180 minutes (3 hours) |
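The runbook deadlines can be computed mechanically from T₀ so that SLA timers and escalation alerts never depend on manual arithmetic. A minimal sketch (the step names are illustrative; the minute offsets mirror the runbook above):

```python
from datetime import datetime, timedelta

# SLA offsets in minutes from T0 (receipt of a valid notice), per the runbook.
SLA_MINUTES = {
    "triage": 15,
    "verify_notice": 30,
    "locate_content": 45,
    "execute_takedown": 90,
    "log_action": 120,
    "notify_user": 150,
    "final_confirmation": 180,
}

def deadlines(t0: datetime) -> dict:
    """Map each runbook step to its hard deadline relative to T0."""
    return {step: t0 + timedelta(minutes=m) for step, m in SLA_MINUTES.items()}

t0 = datetime(2026, 3, 1, 9, 0)  # notice received at 09:00
schedule = deadlines(t0)
```

A ticketing system would feed these timestamps into automatic escalation triggers, for example paging the on-call legal counsel if `verify_notice` is still open at its deadline.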

Sample Internal Incident Ticket Fields

Platforms should standardise internal incident tickets for takedown requests. The likely practical fields include:

  • Ticket ID: Auto-generated unique reference
  • Notice source: Government authority / Court / GAC / Authorised complainant
  • Content URL(s): All known instances
  • Content type: Deepfake / Impersonation / Intimate imagery / Other SGI
  • Urgency tier: Priority (2-hour) / Standard (3-hour)
  • Takedown timestamp: Actual removal time
  • Evidence hash: Cryptographic hash of preserved content
  • User notification status: Sent / Pending / N/A

Sample Takedown Acknowledgement Notice

“We have received a valid notice dated [DATE] from [AUTHORITY] directing the removal of content at [URL]. In compliance with the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, access to this content has been disabled within the prescribed timeline. If you believe this action is in error, you may file an appeal through [GRIEVANCE MECHANISM].”

Intermediary Liability in India: Due Diligence, Audits, Record-Keeping and Traceability

Section 79 of the Information Technology Act, 2000 grants intermediaries a conditional safe harbour from liability for third-party content, but only where they observe “due diligence” as prescribed. The IT Amendment Rules 2026 substantially expand what qualifies as due diligence, and the consequences of non-compliance are correspondingly severe: loss of the safe-harbour shield, exposing the platform to civil and criminal liability for the content it hosts.

Expanded Due Diligence Obligations

Under the 2026 rules, platforms must now satisfy several additional due diligence requirements beyond those introduced in 2021:

  • SGI verification pipeline: Implement technical systems to verify user declarations about whether content is SGI and to detect undeclared SGI using detection tools.
  • Metadata preservation: Retain SGI labels, metadata and unique identifiers for the duration prescribed by the rules, ensuring they are producible on demand to regulators or courts.
  • Traceability of first originator: For SSMIs providing messaging services, the existing obligation to identify the first originator of information in response to a lawful order is reinforced and extended to SGI specifically.
  • Periodic compliance audits: SSMIs must conduct or commission periodic audits of their SGI labelling, takedown and grievance systems and make audit reports available to MeitY upon request.

Compliance Matrix by Entity Type

| Obligation / Requirement | Applies To | Timeline / SLA |
| --- | --- | --- |
| SGI labelling (label + metadata + tamper-proof retention) | All SSMIs and regulated intermediaries hosting SGI | At time of publication; persistent metadata with anti-tamper retention per rules |
| Emergency takedown for unlawful deepfakes and impersonation | SSMIs and large platforms | Remove within 3 hours of valid notice |
| Audit and record-keeping | Designated intermediaries and SSMIs | Retention windows per rules; records producible on request |
| First-originator traceability | SSMIs with messaging functionality | On lawful order from court or authorised authority |
| Grievance officer appointment and response timelines | All intermediaries | Acknowledge within 24 hours; resolve within 15 days (standard) or per accelerated timelines for SGI |

The practical effect of these expanded requirements is that intermediary liability in India is now more tightly coupled to a platform’s demonstrable investment in compliance infrastructure. Platforms that can evidence robust SGI detection, labelling, audit and takedown systems will be in a materially stronger position to defend their safe-harbour status.

Grievance, Notice and Appellate Procedures

The 2026 rules reinforce the three-tier grievance architecture first established in 2021 and add SGI-specific procedural requirements that TMT practices across India must embed in platform operations.

Step-by-Step Grievance Flow

  1. Receipt and acknowledgement. The platform’s Grievance Officer must acknowledge any user complaint within 24 hours of receipt.
  2. Investigation and resolution. Standard complaints must be resolved within 15 days. Complaints relating to SGI, particularly those alleging non-consensual deepfakes or impersonation, attract the accelerated 3-hour takedown SLA where the complaint constitutes a valid notice.
  3. User notification. The complainant and the content creator must be notified of the action taken and the reasoning behind it.
  4. Appellate escalation. If either party is dissatisfied, they may escalate to the Grievance Appellate Committee (GAC). The GAC’s decision is binding on the intermediary, which must comply within the prescribed timeline.

Sample User Grievance Response Template

“Dear [User], your complaint reference [REF] has been reviewed by our Grievance Officer. After assessment, the reported content has been [removed / retained with label / retained without action] for the following reason: [REASON]. If you disagree with this decision, you may appeal to the Grievance Appellate Committee at [GAC PORTAL]. Regards, [Platform Name] Grievance Team”

Platforms should ensure that all grievance interactions (receipt, investigation steps, decisions and notifications) are logged with timestamps and preserved as part of the compliance audit trail.

Technical and Product Implementation Checklist

Translating legal obligations into engineering deliverables requires a structured platform compliance checklist. The following items represent the minimum viable compliance posture for SSMIs and large intermediaries under the IT Rules 2026.

SGI Detection and Labelling Stack

  • User declaration intake: Add a mandatory SGI declaration toggle/checkbox at the point of content upload across all surfaces (web, mobile, API).
  • Automated SGI detection: Integrate computer-vision, audio-analysis and text-classifier models to flag content likely to be synthetic, supplementing user declarations with platform-side verification.
  • Label rendering: Implement front-end components to display SGI badges in feeds, detail pages, video players and embedded/shared views.
  • Metadata schema: Define a structured metadata schema (e.g., JSON-LD or C2PA-compatible) recording generating tool, generation date, unique identifier, uploader declaration and verification status.
  • Anti-tampering layer: Apply cryptographic hashing or digital signatures to SGI metadata at ingestion; implement pipeline checks that detect and block metadata-stripped re-uploads.
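The last bullet, blocking metadata-stripped re-uploads, can be sketched as an ingestion check against a registry of known SGI hashes. This is a hypothetical sketch under strong simplifying assumptions: it uses exact hashing and an in-memory dict, whereas a production pipeline would need a durable store and perceptual hashing so re-encoded copies still match.

```python
import hashlib
from typing import Optional

# In-memory registry of known SGI content hashes -> metadata (illustrative only).
sgi_registry = {}

def register_sgi(content: bytes, metadata: dict) -> str:
    digest = hashlib.sha256(content).hexdigest()
    sgi_registry[digest] = metadata
    return digest

def check_upload(content: bytes, declared_metadata: Optional[dict]) -> str:
    """Classify an incoming upload at the ingestion pipeline."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in sgi_registry and declared_metadata is None:
        # Known SGI arriving without its label: metadata was stripped downstream.
        return "block_metadata_stripped"
    if declared_metadata is not None:
        return "accept_labelled"
    return "accept_organic"

register_sgi(b"clip-bytes", {"tool_id": "example-model-v1"})
```

Content flagged `block_metadata_stripped` would be routed to the review-and-re-verification flow described in the anti-tampering section above.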

Takedown Response Infrastructure

  • 24/7 on-call rota: Staff a Trust & Safety / Legal Ops on-call rotation covering all time zones, with authority to execute takedowns without senior-management approval for qualifying notices.
  • Content-matching system: Deploy hash-matching or fingerprinting tools to locate re-shares and copies of content targeted for takedown across the platform’s surfaces.
  • Incident ticketing: Configure ticketing systems with the fields outlined in the runbook above, including automatic SLA timers and escalation triggers.
  • Evidence preservation: Implement automated snapshotting and cryptographic hashing of content at the point of takedown, stored in a compliance-only data store with access controls.
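The evidence-preservation bullet can be sketched as a snapshot record bound to a cryptographic hash, so the preserved copy can later be shown to match exactly what was removed. Field names here are assumptions for illustration:

```python
import hashlib
from datetime import datetime, timezone

def preserve_evidence(content: bytes, notice_ref: str) -> dict:
    # Snapshot record linking the removed content to the notice that triggered it.
    return {
        "notice_ref": notice_ref,
        "sha256": hashlib.sha256(content).hexdigest(),
        "preserved_at": datetime.now(timezone.utc).isoformat(),
        "size_bytes": len(content),
    }

record = preserve_evidence(b"offending content bytes", "NOTICE-2026-001")
```

In practice the record and the content snapshot would be written to a compliance-only store with restricted access, and the hash would also be copied into the takedown ticket's `evidence_hash` field.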

Audit and Retention

  • Audit trail logging: Record every SGI label application, modification attempt, takedown action, grievance decision and metadata query in an immutable log.
  • Retention policy: Align data retention periods with the rules’ requirements; ensure that SGI metadata and takedown records are retained for at least the mandated window and are producible on request in a machine-readable format.
  • Periodic audit scheduling: Establish a compliance audit calendar (industry observers suggest quarterly at minimum for SSMIs) and engage independent auditors with TMT and information-security expertise.

Contractual and Commercial Mitigations: Sample Clauses and Negotiation Guidance

Legal compliance extends beyond technical systems. Platforms must update their contractual frameworks to allocate risk appropriately and to create enforceable obligations on users, content creators and AI-technology vendors.

Sample Clause 1 (User Terms): SGI Declaration Warranty

“You represent and warrant that, when uploading Content, you will accurately declare whether such Content constitutes Synthetic or Generated Information (‘SGI’) as defined under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026. Failure to make an accurate declaration constitutes a material breach of these Terms and may result in content removal, account suspension or termination.”

Sample Clause 2 (AI Vendor SLA): Metadata and Labelling Obligations

“Vendor shall ensure that all outputs generated by the AI System include embedded metadata compliant with the Platform’s SGI Metadata Schema, including the model identifier, generation timestamp and a unique content identifier. Vendor shall not deploy any mechanism to strip, obscure or alter such metadata. Vendor shall maintain systems availability of [99.9]% for metadata-generation services, measured monthly.”

Sample Clause 3: Indemnity

“Content Creator shall indemnify and hold harmless Platform against any claims, losses, penalties or regulatory actions arising from Content Creator’s failure to accurately declare SGI status, or from Content Creator’s removal, modification or suppression of SGI labels or metadata in breach of these Terms or Applicable Law.”

Sample Clause 4: Audit Right

“Platform reserves the right to audit Vendor’s SGI labelling, metadata-generation and anti-tampering systems no more than [twice] per calendar year, upon [30] days’ prior written notice. Vendor shall provide reasonable access to relevant systems, logs and documentation. Audit findings indicating non-compliance shall be remediated by Vendor within [15] business days of notification.”

Negotiation Tips for Platform Counsel

  • Redline vendor liability caps carefully. AI vendors may resist unlimited liability for metadata failures; negotiate carve-outs ensuring that regulatory fines and loss of safe-harbour protection are excluded from any cap.
  • Require real-time metadata API access. If the vendor hosts the generative model, insist on API-level access to metadata fields at the point of generation, not as a batch process.
  • Build termination triggers around compliance. A vendor’s persistent failure to embed compliant metadata should constitute a material breach entitling the platform to terminate without penalty.

Enforcement, Penalties and Risk Calibration

The 2026 rules do not introduce a standalone monetary-penalty schedule comparable to the EU’s Digital Services Act. Instead, the primary enforcement mechanism remains the withdrawal of safe-harbour protection under Section 79 of the IT Act. A platform that fails to comply with the prescribed due diligence, including SGI labelling and takedown timelines, is treated as if it were the publisher of the offending content, exposing it to the full spectrum of civil and criminal liability.

Risk-Based Prioritisation: 30/60/90-Day Roadmap

  • Days 1–30: Activate the 3-hour takedown runbook and 24/7 response rota; update user-facing terms to include the SGI declaration warranty; publish an updated grievance-officer contact and procedure page.
  • Days 31–60: Deploy SGI labelling on at least the primary content-upload surface; integrate automated SGI detection for the highest-risk content categories (video, image); execute first round of internal audit trail testing.
  • Days 61–90: Extend SGI labelling to all surfaces (API, embedded, re-shared); finalise AI-vendor SLA amendments; complete first external compliance audit; submit compliance report to MeitY if required.

Industry observers expect MeitY to prioritise enforcement actions against SSMIs that fail to implement the 3-hour takedown for non-consensual deepfakes and impersonation content, given the high public salience of these categories. Platforms should allocate resources accordingly.

Conclusion and Recommended Next Steps

The IT Amendment Rules 2026 mark a structural shift in India’s platform-regulation landscape, moving from reactive content moderation toward proactive, infrastructure-level compliance. For TMT practitioners in India and the organisations they advise, the message is clear: compliance is no longer a policy exercise but an engineering, contractual and operational programme that demands cross-functional investment. Platforms that act decisively (standing up takedown operations, deploying SGI labelling, and updating their contractual frameworks within the first 90 days) will be best positioned to maintain safe-harbour protection and build regulatory trust. Those seeking specialist guidance on implementation should consult experienced TMT lawyers with deep expertise in India’s evolving intermediary-liability and AI-governance framework.

Need Legal Advice?

This article was produced by Global Law Experts. For specialist advice on this topic, contact Siddharth Mahajan at Athena Legal Advocates & Solicitors, a member of the Global Law Experts network.


FAQs

What do the IT Rules 2026 require of platforms?
Platforms must implement SGI labelling with tamper-proof metadata, comply with a 3-hour takedown SLA for notified unlawful content, strengthen due diligence and grievance procedures, and maintain auditable records of all compliance actions.
What is the 3-hour takedown rule?
The rule requires platforms to remove or disable access to certain unlawful content, particularly non-consensual deepfakes and impersonation SGI, within three hours of receiving a valid notice from a government authority, court, or authorised complainant.

What are the SGI labelling requirements?
Labels must be clear, prominent and user-facing, accompanied by machine-readable metadata fields (generating tool, creation date, unique identifier). Anti-tampering safeguards under the rules prohibit removal or modification of labels and metadata.

Can an SGI label or its metadata be removed once applied?
No. Under the anti-tampering provisions of the rules, the SGI label, metadata and unique identifier must not be modified, suppressed or removed by any party, including the platform itself.

How do the 2026 rules affect intermediary liability?
They expand due diligence requirements, introduce mandatory SGI labelling and strengthen record-keeping and audit obligations. Non-compliance risks loss of safe-harbour protection under Section 79, exposing platforms to direct liability for hosted content.

Is a platform liable if taken-down content is re-uploaded?
Liability is significantly mitigated where platforms can demonstrate they followed rule-prescribed procedures, preserved evidence and maintain ongoing due diligence. Platforms should deploy content-matching tools to detect and remove re-uploads proactively.

Can platforms rely solely on automated SGI detection?
The rules require both user declarations and platform-side technical verification. Relying exclusively on automated detection is unlikely to satisfy due diligence requirements. A layered approach, combining user disclosure, automated detection and human review for edge cases, is the defensible standard.

What makes a takedown notice valid?
A valid notice should identify the complainant, specify the content URL(s), describe the alleged violation, cite the relevant legal basis, and provide contact details. Platforms should publish notice templates and guidance to reduce invalid submissions that consume the 3-hour window.
