Ministry of Electronics & IT
Government has issued multiple advisories emphasizing the observance of due diligence obligations under the IT Act and IT Rules
Regulatory framework strengthened to address harms arising from synthetically generated information (SGI), including deepfakes and AI-generated content
Posted On: 25 MAR 2026 3:55PM by PIB Delhi
The policies of the Government of India are aimed at ensuring an open, safe, trusted and accountable cyberspace for users in the country. The Government is cognizant of the risks and harms arising from the misuse of digital technologies, including social media. The Information Technology Act, 2000 (“IT Act”) and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules”), together, establish a framework to deal with unlawful content in the digital space.
Advisories and Standard Operating Procedure (SOP):
The Government has issued multiple advisories to intermediaries, including Social Media Intermediaries, emphasizing the observance of due diligence obligations under the IT Act and IT Rules. The details of the advisories issued and SOPs formulated are as follows:
- An advisory dated 26.12.2023 was issued to intermediaries to ensure compliance regarding observance of due diligence obligations under the IT Rules
- An advisory dated 15.03.2024 was issued to intermediaries to ensure compliance with statutory due diligence obligations under the IT Act and IT Rules. The advisory specifically addresses risks arising from computer resources that permit the synthetic creation or modification of text, audio, and audiovisual content
- A Standard Operating Procedure (“SOP”) to curtail dissemination of Non-Consensual Intimate Imagery (NCII) content on online platforms was formulated and released on 11.11.2025. The SOP provides detailed guidance for victims, intermediaries and law enforcement agencies to ensure prompt and uniform action against the online dissemination of NCII content, including intimate or morphed images shared without consent
- An advisory dated 29.12.2025 was issued to intermediaries, including the Social Media Intermediaries, reiterating the observance of statutory due diligence obligations by the Intermediaries under the IT Act and IT Rules, for preventing hosting, publication, transmission, sharing, or uploading of vulgar, indecent, obscene, pornographic and other unlawful content on their platforms
- An advisory dated 09.02.2026 was issued to intermediaries regarding the responsible handling of information relating to religious matters or otherwise unlawful information
- An advisory dated 16.03.2026 was issued to intermediaries with respect to the generation, hosting, publication, transmission, sharing or uploading of abusive, defamatory, objectionable, derogatory and misleading synthetically generated information
On 10th February, 2026, the Government strengthened the regulatory framework by amending the IT Rules to address harms arising from synthetically generated information (SGI), including deepfakes and AI-generated content.
Key points related to the amendment are as follows:
- Intermediaries and social media platforms are required to deploy reasonable technical measures to prevent the creation and dissemination of unlawful AI-generated content, including content that is obscene, misleading, impersonating individuals, or harmful to children
- Platforms are also required to ensure clear labelling and traceable metadata for permissible AI-generated content, so that users can easily identify synthetically generated material and prevent deception or misuse
- It further strengthens user accountability and platform due diligence, including mandatory user awareness regarding legal consequences of unlawful AI-generated content and stronger compliance obligations for social media intermediaries
- Importantly, the guidelines explicitly cover child sexual exploitation material, non-consensual intimate imagery, impersonation and other harmful AI-generated content, requiring platforms to prevent such content and take prompt action when detected
- Intermediaries are obligated to deploy reasonable and appropriate technical measures, including automated tools or other suitable mechanisms, to not allow any user to create, generate, modify, alter, publish, transmit, share, or disseminate, as the case may be, any synthetically generated information that violates any law for the time being in force
- Social media platforms and other intermediaries are required to remove unlawful content within three hours of the receipt of an order of a court of competent jurisdiction or a reasoned intimation by the Appropriate Government or its agency
This information was submitted by Union Minister of State for Electronics and Information Technology Shri Jitin Prasada in Lok Sabha on 25.03.2026.
***
MSZ
(Release ID: 2245053)