Delhi High Court orders deepfake regulation meet

Deepfake Regulation

The Delhi High Court has directed the Central government to conduct a meeting with providers and deployers of deepfake technology, telecom service providers, victims of deepfakes, and intermediaries such as Facebook and X (formerly Twitter), to finalize suggestions on the detection and removal of deepfakes. The Division Bench of Chief Justice Manmohan and Justice Tushar Rao Gedela passed the order on Wednesday after the Ministry of Electronics and Information Technology apprised the Court that it would soon constitute a dedicated committee to give suggestions on regulating deepfake technology. The High Court instructed the committee to offer recommendations on the creation, detection, and removal of deepfakes while considering regulatory and statutory frameworks in other countries, specifically those in the European Union.

The Division Bench passed the order while hearing two writ petitions filed by Rajat Sharma, Editor-in-Chief of India TV, and Advocate Chaitanya Rohilla. Advocate Darpan Wadhwa represented Rajat Sharma, while Rohilla appeared in person. The High Court granted the sub-committee on deepfake regulation three months to submit its report and posted the matter for further hearing on March 25, 2025.

In a status report presented to the High Court, the Ministry of Electronics and Information Technology (MeitY) mentioned that a sub-committee was formed through an office memorandum issued on November 20. This sub-committee comprises members from the Emerging Technologies division, Cyber Security division, and Cyber Law division of MeitY, augmented by representatives from the Indian Cybercrime Coordination Centre (I4C), the Centre for Development of Advanced Computing, Hyderabad, and the Data Security Council of India. It also includes a professor from the Indian Institute of Technology, Madras, and one legal representative.

High Court mandates deepfake regulation meeting

The High Court noted that the status report did not disclose the names of the committee members. Additional Solicitor General Chetan Sharma assured the Court that the process of nominating members would be expedited.

It was clarified that a sub-committee formed in March 2023 had already submitted its report on regulating Artificial Intelligence technology, but the newly formed committee would focus specifically on deepfakes. Advocate Wadhwa, appearing for Sharma, argued that social media intermediaries would be primary stakeholders in regulating deepfakes. Consequently, the High Court directed the committee to consider suggestions from intermediaries such as Facebook and X (formerly Twitter).

Wadhwa also suggested reducing the compliance period for the removal of prohibited content from the current 72 hours. Rohilla's petition sought directives for the government to identify and block websites providing access to deepfake technology, issue dynamic injunctions, set guidelines for AI regulation, ensure fair implementation of AI, and ensure that access to AI and deepfake technology conforms strictly to fundamental rights. The petition raised concerns about privacy violations, as well as the economic and emotional harm caused by deepfakes.

The matter is scheduled for further hearing on March 25, 2025.
