India has announced stricter rules for social media platforms regarding unlawful content. The government now requires companies to remove illegal material within three hours of receiving official notice. This is a major change from the previous 36-hour deadline.
The updated regulations amend India's Information Technology (IT) rules introduced in 2021. These rules have often caused tensions between the government and global technology firms. The new changes will take effect from February 20.
According to the government, platforms must act quickly when content violates national laws. This includes material linked to national security, public order, or other legal concerns. However, the directive did not explain why the timeline was reduced.
Legal experts believe the new deadline may be difficult to meet. Akash Karmakar, a technology law specialist, said it is almost impossible to review and remove content within three hours. He added that companies need time to assess requests and carry out proper checks.
India is already known as one of the strictest regulators of online content. The country has given wide powers to government officers to issue takedown orders. This has raised concerns among digital rights groups. They fear the rules may increase censorship.
Major platforms such as Meta, which owns Facebook, did not comment on the changes. X and Google, which runs YouTube, also did not respond immediately. These companies operate in India, a large market with more than one billion internet users.
In recent years, India has issued thousands of content removal requests. Transparency reports show that Meta restricted more than 28,000 pieces of content in India in the first half of 2025 alone. These actions were taken following government demands.
Globally, governments are pushing social media firms to take more responsibility. Countries across Europe and South America are also demanding faster action on harmful content. However, critics argue that extremely short deadlines may limit platforms' ability to assess cases fairly.
The amended rules also changed how artificial intelligence content should be labelled. Earlier proposals required labels to cover a fixed portion of content. Now, platforms must ensure AI-generated material is clearly and prominently marked.
Some industry executives say the new rules were introduced without proper consultation. They believe international standards usually allow more time for compliance. Still, India maintains that stronger controls are necessary to manage harmful content.
With these changes, India continues to strengthen its regulatory grip on digital platforms. The move places more pressure on companies to balance legal compliance with free expression.




