IMPACT OF EVOLVING CYBER GOVERNANCE POLICIES ON INTERMEDIARY LIABILITY REGIMES

Author: Vidisha Ramteke, School of Law, Devi Ahilya Vishwavidyalaya, Indore



ABSTRACT

This paper examines the dynamics of the evolution of cyber governance policies, including legislative, regulatory, and judicial developments such as the Information Technology Act, 2000, the IT Rules, 2021 and their subsequent amendments. It also explores the transformation of the legal regime, from the traditional safe harbour protection under Section 79 of the Information Technology Act, 2000, to a more regulated, compliance-driven model under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. A doctrinal research method has been adopted, relying on secondary sources such as statutes, case law, policy documents, and scholarly works to analyse the evolution of intermediary liability. The study also examines the broader implications of these regulatory developments for digital rights, including privacy, freedom of speech and expression, and legal certainty in the digital ecosystem. It concludes by highlighting the gaps in the provisions, the expansion of State control, and their impact on user rights, and calls for a more balanced, rights-protective framework.


Keywords: Cyber Governance, Intermediary Liability, Safe Harbour


INTRODUCTION

As information technology advances rapidly, digital platforms such as social media and messaging services have become central actors in the online ecosystem. Under Section 79 of the Information Technology Act, 2000, intermediaries were granted safe harbour protection, ensuring limited liability for third-party content, provided they acted as neutral conduits. However, recent developments, particularly the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 and subsequent amendments, reflect a shift towards increased regulatory control and platform accountability. Measures such as traceability requirements and content regulation mechanisms have raised concerns regarding privacy and freedom of expression.

Existing literature largely examines these issues in isolation and lacks an integrated analysis of how these evolving policies collectively interact. Much of the scholarship focuses either on increased accountability or on the protection of fundamental rights, without adequately balancing the two. This paper aims to address this gap by critically analysing the transformation of intermediary liability regimes in India and their implications for both users and platforms.


RESEARCH QUESTIONS

1. To what extent have evolving cyber governance policies in India transformed intermediary liability regimes, and what are their implications for privacy, freedom of expression, and legal certainty?

2. To what extent has the evolving intermediary liability regime in India generated both regulatory advantages and practical challenges for intermediaries and users?


LITERATURE REVIEW

Shakti Jayanth S. (2021), A Critical Analysis of Liabilities of Intermediaries with Reference to the Information Technology Intermediary Rules, 2021.

In this study, the author examines the practical implications of recent intermediary regulations and the issue of censorship in the digital media landscape. The author discusses concerns relating to data confidentiality and messaging applications, particularly in the context of end-to-end encryption. It is argued that such measures may violate individuals' right to privacy and expose user data to security risks, as intermediaries may be required to store information to identify message origins and track forwarding chains. The study also highlights due diligence and the role of compliance officers in monitoring content. While the study is well-researched and presents strong arguments, it lacks a clear methodological framework and does not propose practical solutions to address the issues identified.


Akriti Shashank, Prof. Dr. Monika Rastogi & Sweksha Bhadauria (2025), Social Media and the Right to Free Speech in India: Constitutional Challenges, Legal Boundaries, and the Role of Judicial Oversight in the Digital Era.

In this study, the authors critically examine the constitutional challenges, legal boundaries, and the role of judicial oversight concerning the right to freedom of speech and expression in India. The study analyses Article 19 of the Constitution and its role in guaranteeing individual liberty and free speech. It further highlights that the concept of harmful speech is context-dependent and varies across different social and cultural settings. The authors argue that excessive governmental control over intermediaries may lead to disproportionate regulation of online discourse, thereby concentrating power over what can be expressed in the digital space. However, while the study provides a strong constitutional analysis, it lacks a clear framework for defining the limits of free speech in practical terms. It also does not sufficiently address how regulatory policies can be effectively implemented while ensuring the protection of citizens' freedom of expression.


Rupali Agrawal (2023), Intermediary Liability in the Context of Online Platform: Comparative Analysis.

The study examines the concept of intermediary liability in the context of online platforms through a comparative analysis of legal frameworks across India, the United States, and the European Union. The study further highlights the evolving role of intermediaries in facilitating digital communication and explores the balance between safe harbour protections and increasing regulatory obligations. The paper also discusses different forms of intermediary liability, such as strict liability, safe harbour protection, and criminal liability, while examining judicial approaches and case laws shaping the Indian framework. Additionally, it provides a comparative perspective by analysing Section 230 of the United States' Communications Decency Act and the European Union's regulatory approach, particularly the Digital Services Act, highlighting the need for a balanced regulatory model. While the study provides a comprehensive doctrinal and comparative analysis, it remains limited in its critical examination of the cumulative impact of these evolving policies on digital rights, such as privacy and freedom of expression.


Ankita Sharma & Dr. Vivek Malik (2025), Freedom of Speech in The Digital Age: A Legal and Socio-Political Study of Social and Electronic Media in India.

In this study, the authors examine the right to freedom of speech and expression in India within the context of social and electronic media, focusing on its constitutional foundation under Article 19(1)(a) and the scope of reasonable restrictions under Article 19(2). The study analyses how the proliferation of digital platforms has transformed public discourse by enabling wider access to information and civic participation, while simultaneously contributing to challenges such as misinformation, hate-driven content, cyber libel, and regulatory overreach. It further explores the role of statutory frameworks, particularly the Information Technology Act, 2000 and the Intermediary Guidelines, 2021, in governing online speech, as well as judicial developments that define the contours of digital expression. The authors also adopt a mixed-methods approach, combining doctrinal analysis with empirical data, including surveys and interviews, to assess public perceptions of censorship, platform regulation, and state intervention. While the study provides a comprehensive socio-legal and empirical understanding of free speech in the digital age, it remains broader in scope and does not specifically focus on intermediary liability as an independent legal framework.


RESEARCH METHODOLOGY

This study adopts a doctrinal research method, relying primarily on secondary sources. The analysis is based on relevant statutory provisions, including the Information Technology Act, 2000, the Information Technology Rules, 2021, and their amendments. It also draws upon judicial decisions interpreting intermediary liability and safe harbour protections. Additionally, the research incorporates scholarly works, books, journal articles, and legal commentaries to examine the evolving cyber governance framework and its impact on intermediary liability regimes.


STATUTORY PROVISIONS AND CASE LAWS

India has witnessed significant developments in its cyber regulatory framework alongside the rapid rise in the use of intermediaries. Initially, the Information Technology Act, 2000 was enacted to provide protection to intermediaries through safe harbour provisions, recognising their role as neutral facilitators. However, with the rise in the use of digital platforms, intermediaries began to play a more influential role in the spread of content, raising concerns about misuse, misinformation, and potential threats to public order. In response, the government introduced various amendments and regulatory measures, culminating in the Information Technology Rules, 2021. These measures imposed additional obligations and limitations on intermediaries to enhance accountability and control over online content. However, such regulatory expansion has also raised concerns about the potential infringement of fundamental rights, particularly freedom of speech and expression.


Information Technology Act, 2000

Under Section 2(1)(w) of the Information Technology Act, 2000, an intermediary is defined as any person who, on behalf of another person, receives, stores, or transmits electronic records or provides any service with respect to such records.

The foundation of intermediary liability in India is laid under Section 79 of the Information Technology Act, 2000, which provides conditional immunity to intermediaries from liability for third-party information, data, or communication links hosted on their platforms. Though it comes with certain restrictions, it applies when the intermediary neither initiates transmission, selects the receiver, nor modifies the information contained in the transmission.


Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021

Rules 3 and 4 of the IT Rules, 2021 impose due diligence obligations on intermediaries, requiring them to publish rules and regulations, privacy policies, and user agreements, and to ensure that users do not host or share unlawful information. Furthermore, Rule 4(2) mandates that significant social media intermediaries providing messaging services must enable the identification of the first originator of information in certain cases relating to offences such as public order and national security.


Article 19 of the Indian Constitution

Article 19 guarantees certain fundamental freedoms to citizens, including the right to freedom of speech and expression under Article 19(1)(a). This right is essential in the digital context, as it protects individuals' ability to communicate and participate in online discourse. However, it is not absolute and is subject to reasonable restrictions under Article 19(2) on grounds such as public order, security of the State, decency, and defamation. In this context, measures such as content-takedown obligations, traceability requirements, and fact-checking mechanisms raise concerns because they may expand State control over online speech and risk exceeding permissible limits, thereby chilling free expression.


Case Laws

Shreya Singhal v. Union of India

In Shreya Singhal v. Union of India, the Supreme Court clarified that intermediaries cannot be required to independently determine the legality of content and are only obligated to remove information upon receiving a valid court order or a government notification issued in accordance with law. This interpretation was crucial in preserving the safe harbour protection under Section 79 and preventing excessive censorship by intermediaries. The case limits intermediary liability by requiring actual knowledge via a court or government order, which directly conflicts with the proactive monitoring expectations under the 2021 Rules.

K.S. Puttaswamy v. Union of India

In K.S. Puttaswamy v. Union of India, the Supreme Court recognised privacy as a fundamental right subject to the tests of legality, necessity, and proportionality. The mandatory identification of the first originator risks failing the proportionality test, as it imposes a blanket intrusion on encrypted communications without adequate safeguards. Together, these developments indicate that the evolving intermediary liability regime places strain on both free speech and privacy protections, thereby necessitating stronger judicial oversight to maintain constitutional balance.


DISCUSSION: REGULATORY SHIFT AND ITS IMPLICATIONS

Safe Harbour to Conditional Liability

The intermediary liability regime in India has undergone a significant transformation from a system of limited liability under Section 79 of the Information Technology Act, 2000, to a framework based on conditional compliance. Safe harbour protection is no longer absolute and is increasingly dependent on adherence to due diligence requirements under the IT Rules, 2021. This shift reflects a move towards greater accountability but also narrows the scope of protection available to intermediaries.

Intermediaries are no longer treated as passive facilitators of content. The imposition of obligations such as monitoring, traceability, and content regulation has converted them into active regulators within the digital ecosystem. This transformation alters the foundational principle of intermediary neutrality and places a substantial burden on platforms to control user-generated content.


Privacy and Data Confidentiality

The introduction of traceability requirements and the use of automated tools for content filtering raise serious concerns regarding user privacy and data confidentiality. These measures may weaken end-to-end encryption and require intermediaries to engage in practices that expose user data to risks, including surveillance, misuse, and breaches. The broad expansion of content regulation mechanisms also increases the likelihood of over-compliance by intermediaries. In an effort to avoid liability, platforms may remove content pre-emptively, which can have a chilling effect on legitimate speech.

The traceability requirement under the IT Rules, 2021, represents a disproportionate intrusion into user privacy. By mandating the identification of the first originator of information, intermediaries are compelled to weaken encryption systems, thereby exposing user communications to potential surveillance. This measure fails to satisfy the proportionality test laid down in K.S. Puttaswamy v. Union of India, as it applies broadly to all users rather than being narrowly tailored to specific threats. Consequently, traceability shifts the balance from targeted regulation to systemic surveillance, raising serious constitutional concerns.


Legal Uncertainty and Expansion of State Control

The use of broad and undefined terms such as 'due diligence' and 'misleading information' contributes to ambiguity within the regulatory framework. This lack of clarity creates uncertainty for intermediaries in determining their obligations and results in inconsistent enforcement. The evolving regulatory framework indicates increased State involvement in digital governance. Mechanisms such as traceability requirements and fact-checking measures reflect a shift towards greater executive control over online content. While such regulation may be aimed at maintaining order, it raises concerns regarding proportionality and the absence of sufficient safeguards.

Taken together, these developments demonstrate that intermediary liability now extends beyond regulating platforms to directly affecting user rights. The digital environment has become more restrictive, highlighting the need for a balanced approach that ensures accountability while protecting fundamental rights such as privacy and freedom of expression.


Privatisation of Censorship

The evolving regulatory framework results in the privatisation of censorship, wherein intermediaries, under the threat of liability, engage in pre-emptive content removal. This phenomenon is driven by vague compliance standards such as 'due diligence,' which incentivise platforms to over-censor content to avoid regulatory consequences. As a result, decisions affecting freedom of speech are increasingly made by private actors rather than courts, undermining procedural safeguards recognised in Shreya Singhal v. Union of India. This shift not only erodes transparency but also creates a chilling effect on legitimate expression, particularly dissenting or controversial viewpoints.


Judicial Supervision

The judiciary plays a significant role in ensuring that regulatory measures governing intermediaries do not infringe on fundamental rights, particularly freedom of speech and expression. As there is a thin line between permissible restriction and unconstitutional censorship, it is essential that any limitation on online content is subject to judicial scrutiny. The judiciary acts as a safeguard against arbitrary State action by ensuring that restrictions comply with constitutional principles, including reasonableness, necessity, and proportionality.

In the context of intermediary liability, increased regulatory obligations, such as due diligence and content-takedown mechanisms, may lead to excessive or pre-emptive censorship by platforms. Courts play a vital role in maintaining the balance between accountability and the protection of individual rights, particularly in cases involving online speech and digital expression. Further, there is a need to strengthen institutional mechanisms for addressing disputes related to online content. The establishment of specialised tribunals or expedited procedures for content-related grievances can ensure timely and effective redressal. Such mechanisms would reduce the burden on courts while ensuring that decisions affecting free speech are subject to fair and independent review.


COMPARATIVE ANALYSIS: UNITED STATES, EUROPEAN UNION AND INDIA

United States

The United States follows a relatively liberal approach to intermediary liability under Section 230 of the Communications Decency Act, which provides broad immunity to online platforms for third-party content. Intermediaries are generally not held liable for content posted by users and are also protected when they voluntarily remove harmful content in good faith. This framework promotes freedom of expression and innovation but has been criticised for enabling the spread of harmful or misleading content due to limited regulatory control.


European Union Approach

The European Union adopts a more structured and balanced model through the Digital Services Act. While intermediaries are granted safe harbour protections, they remain subject to clearly defined obligations, such as transparency, risk assessment, and accountability mechanisms. The framework emphasises user rights, due process, and proportionality, ensuring that content moderation is subject to safeguards, including notice-and-action procedures and avenues for appeal.


Concluding Comparison

The comparison highlights that while all jurisdictions aim to balance accountability and freedom, their approaches differ significantly. The U.S. prioritises freedom of speech, the EU focuses on balanced regulation with safeguards, and India is moving towards stricter control with expanding intermediary obligations. This indicates the need for India to adopt a more balanced and rights-oriented approach, incorporating clearer standards and stronger procedural safeguards.


RECOMMENDATIONS

Clarifying Legal Standards

There is a need to clearly define vague terms such as 'due diligence' and 'misleading information' within the regulatory framework. Ambiguity in these provisions leads to inconsistent enforcement and over-compliance by intermediaries. Clear statutory guidelines would ensure predictability and reduce arbitrary interpretation. This would also help intermediaries understand the extent of their obligations without risking liability.


Protecting Privacy and Encryption

Traceability requirements should be reconsidered to ensure that they do not undermine end-to-end encryption. Any interference with user data must satisfy the tests of legality, necessity, and proportionality as articulated in K.S. Puttaswamy v. Union of India. Safeguards must be introduced to prevent excessive data retention and misuse. Protecting encryption is essential to maintaining user trust and data confidentiality in digital communication.


Reducing Overreach in Content Regulation

Content regulation mechanisms should include independent oversight and transparent procedures. Intermediaries should not be compelled to act as arbiters of legality without judicial backing, as affirmed in Shreya Singhal v. Union of India. Introducing appeal mechanisms and accountability checks can prevent arbitrary takedowns. This would reduce the risk of censorship and protect freedom of expression.


CONCLUSION

The evolution of cyber governance policies in India reflects a clear transition from a safe-harbour-based regime to a more compliance-driven framework for intermediary liability. While these developments aim to enhance accountability and address emerging digital challenges, they have also raised concerns about privacy, freedom of expression, and legal certainty. The increasing regulatory obligations imposed on intermediaries, including traceability and content control, indicate a shift away from platform neutrality towards greater State influence over digital discourse.

At the same time, the absence of clear standards and safeguards creates risks of over-compliance and potential misuse, ultimately affecting user rights. Therefore, it is essential to adopt a balanced and rights-oriented approach that ensures accountability while preserving fundamental freedoms and maintaining the integrity of the digital ecosystem.







