Such a move will, however, be made only if social media and internet intermediaries continue to roll out the three-hour window without any major hiccups, an official said.
“This is under consideration, but there are no immediate plans to roll this out. Currently, we have a three-hour window in place, and we are receiving constant feedback from all intermediaries on the implementation,” the official said.
The IT ministry will soon start reaching out to all stakeholders, including executives from social media and internet intermediaries, to discuss the feasibility and viability of the proposed one-hour window, another official said.
“Given the size and spread of the Indian internet, the possibility of virality of content here is much higher, and therefore, it is important to act on illegal content that much faster,” the second official said.
An email sent to the IT ministry seeking its response to the proposed shortening of the timeline did not elicit a response till press time.
In February this year, amendments notified to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, stated that objectionable and unlawful content must be taken down within three hours of the intermediary being notified of its presence on the platform.
Further, non-consensual intimate imagery must be removed within one hour of the intermediary being notified, according to the amended rules, which came into effect on February 20.
The push to make compliance timelines stricter for social media intermediaries has gained ground in India, especially after the rise of synthetic content.
In February, shortly after the amended IT rules came into effect, Union Electronics and Information Technology Minister Ashwini Vaishnaw had reiterated that social media and internet intermediaries must take responsibility for the content hosted on their platforms to make them safer for children, women, and other online users.
“Platforms must wake up, must understand the importance of reinforcing trust in the institutions that human society has created over thousands of years,” Vaishnaw had said, adding that social media platforms that do not adopt adequate safety measures to protect their users from such harmful content would be held liable.