This article has been written by Saswati Soumya, pursuing the Diploma in Cyber Law, FinTech Regulations and Technology Contracts from LawSikho. The article has been edited by Smriti Katiyar (Associate, LawSikho).
Introduction
The common argument for declaring oneself a social media intermediary is that intermediaries do not exercise editorial control over content produced by users. Holding exclusive rights over content produced by creators under mutual contractual agreements does not, by itself, alter a legal person's status as an intermediary, because such exclusivity rights relate to the intermediary's obligations to prevent copyright infringement and to stop the spread of sexually explicit content. This became a contentious issue for content platforms like ShareChat and TikTok. Scaling operations for India's regulatory climate required the appointment of a chief compliance officer, a local grievance officer and a nodal officer, on an interim or permanent basis, to look after such issues.
“Intermediaries” under the provisions of the IT Act, 2000
Section 2(1)(w) of the Information Technology Act, 2000 defines an intermediary. The essence of the definition is that one legal person acts on behalf of another person. Some examples of intermediaries are telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online auction sites, online marketplaces and cyber cafes. The second aspect of the definition is that the agency role of an intermediary is limited to (i) receiving, storing or transmitting a particular electronic record; or (ii) providing a service with respect to that electronic record.
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
Rule 3 of Part II of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 ("Digital Media Ethics Code") lists the due diligence that an intermediary is required to observe while discharging its duties. The term "intermediary" here covers both social media intermediaries and significant social media intermediaries. Rules 2(1)(v) and 2(1)(w) differentiate a significant social media intermediary from a social media intermediary. Under the Digital Media Ethics Code, a significant social media intermediary means a social media intermediary whose number of registered users in India exceeds a threshold prescribed by the Central Government. A social media intermediary, in turn, is an intermediary that primarily or solely enables online interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services.
It is imperative for an intermediary to inform its users not to host, upload, modify, publish, transmit, update or share any information which:
(1) does not belong to the user as a matter of right;
(2) is inconsistent with or contrary to the laws in force, such as being defamatory;
(3) is harmful to a child;
(4) infringes intellectual property rights;
(5) violates a law;
(6) deceives/misleads the addressee about the origin of the message;
(7) impersonates another person;
(8) threatens the unity and integrity of India;
(9) contains a software virus that is designed to interrupt, destroy or limit the functionality of a computer resource;
(10) is patently false and untrue.
These are a few examples of information that a user is not allowed to share while using an intermediary's services. The intermediary must expressly set out these restrictions in its rules and regulations, privacy policy or user agreement so that its users are informed, and must publish them on its website, in its mobile-based application, or both. Further, the intermediary must periodically inform its users, at least once a year, that non-compliance with the rules and regulations, privacy policy or user agreement may result in termination of their access or usage rights, in removal of the non-compliant information, or in both, depending on the case. If prohibited information is hosted, stored or published on an intermediary's computer resource, the intermediary's obligation to act arises only once it has "actual knowledge" of it. On receiving an order from a court of competent jurisdiction, or on being notified by the appropriate government or its agency under Section 79(3)(b), the intermediary shall not host, store or publish the prohibited information. Information is construed as prohibited if it relates to:
(1) sovereignty and integrity of India;
(2) security of the State;
(3) friendly relations with foreign State;
(4) public order;
(5) decency or morality;
(6) contempt of court;
(7) defamation;
(8) incitement to an offence that relates to the above;
(9) any other information that is prohibited under any law for the time being in force.
The intermediary shall remove such prohibited content, or disable access to it, as early as possible and in no case later than 36 hours from receipt of the court order or from being notified by the appropriate government or its agency. If the intermediary removes or disables access to information, data or a communication link voluntarily, or on the basis of grievances received, that response does not amount to a violation of Section 79(2)(a) and (b). The intermediary may also be required to preserve such information and associated records for 180 days for investigation purposes, or for a longer period if required by a court or by lawfully authorised government agencies. The intermediary shall provide information under its control or possession no later than 72 hours after receiving an order; such an order must be in writing and must clearly state the purpose for which the information or assistance is sought.
A user or a victim may lodge a complaint about a violation of the rules and regulations with the Grievance Officer. The intermediary shall prominently publish the name and contact details of the Grievance Officer on its website, in its mobile-based application, or both. The Grievance Officer shall acknowledge receipt of a complaint within 24 hours and dispose of it within 15 days. Where a complaint is made by an individual, or by any person on his or her behalf, about content that exposes the private parts of that individual, shows the individual in full or partial nudity, shows or depicts the individual in any sexual act or conduct, or is in the nature of impersonation in an electronic form, including artificially morphed images of the individual, the intermediary shall, within 24 hours of receiving the complaint, take all reasonable and practicable measures to remove or disable access to that content hosted, stored, published or transmitted on its computer resource. The intermediary shall also implement a mechanism for receiving such complaints that enables the individual to provide the details necessary in relation to the content.
The Digital Media Ethics Code prescribes additional due diligence measures to be observed by a significant social media intermediary ("SSMI"). An SSMI shall appoint a Chief Compliance Officer ("CCO"), who must be key managerial personnel or a senior employee of the SSMI and a resident of India. The CCO is responsible for ensuring compliance with the rules and regulations, and is liable if the intermediary fails to observe due diligence, since a significant part of the CCO's duty is to ensure that prohibited third-party information, data or communication links made available or hosted by the intermediary are removed, among other responsibilities. In addition to a CCO, an SSMI shall appoint a nodal contact person, who shall coordinate with law enforcement agencies and their officers round the clock (24x7) to attend to their requisitions or orders. An SSMI shall also appoint a Resident Grievance Officer. Apart from appointing such personnel, the SSMI shall publish a monthly compliance report. The report shall mention the details of complaints received and the action taken in response, as well as the number of specific communication links or parts of information that the intermediary has removed or disabled access to in pursuance of proactive monitoring conducted using automated tools. If an SSMI removes or disables access to any information, data or communication link of its own accord, it must notify the user, explaining the action taken and the grounds or reasons for taking it.
Such intimation shall be provided to the user who created, shared, disseminated or modified the information, data or communication link before the intermediary removes or disables access to it. Moreover, the intermediary shall ensure that the user receives a reasonable opportunity to dispute the action and to request reinstatement of access to the information, data or communication link, which request is to be decided within a reasonable time. The Resident Grievance Officer of the intermediary shall maintain appropriate oversight of this mechanism for resolving disputes between the user and the intermediary.
Judicial approach
Akansha and Bikram are friends. Akansha uploads to Facebook a photo the two of them took together. Once the photo was posted, Bikram did not like the comments from Akansha's friend list and asked Akansha to remove the photo. Akansha hesitated to do so. In such a circumstance, what is the best way forward? Would a complaint about an inappropriate comment suffice? Or would removing one another from their friend lists be an appropriate remedy for such a grievance?
The Shreya Singhal judgment deals with the issue of deciding the acceptable norm for communication in the online world. The case discussed at length, both generally and specifically, whether and to what extent Section 66A of the Information Technology Act, 2000 is constitutional. The judgment does not explicitly deal with the role of the intermediary in deciding the acceptable norm as such, or in deciding its liability. Rather, it compares the online and offline worlds in terms of the intent behind the "marketplace of ideas" and links that comparison to the constitutional question. It was held that there is no intelligible differentia between the medium of print, broadcast and real, live speech on the one hand and speech on the internet on the other. Thus, new categories of criminal offences cannot be created on this ground. The key holdings are:
- The offence created by Section 66A is vague and quite broad. Thus, it is unconstitutional under the provisions of Article 19(1)(a) and is not saved by Article 19(2).
- Section 69A and the Information Technology (Procedure & Safeguards for blocking for access of information by the public) rules, 2009 are constitutionally valid.
- Section 79 is valid subject to Section 79(3)(b) being read down to mean that an intermediary, upon receiving actual knowledge from a court order, or upon being notified by the appropriate government or its agency, that unlawful acts relatable to Article 19(2) are going to be committed, must then expeditiously remove or disable access to such material.
- Information Technology “Intermediary Guidelines” Rules 2011 are valid subject to Rule 3(4) being read in a manner provided in the judgment.
- Section 118(d) of the Kerala Police Act is struck down as being violative of Article 19(1)(a) and not saved by Article 19(2).
Legal analysis
There is a "value gap" between copyright holders and internet platforms such as Dailymotion, YouTube and SoundCloud. Right-holders in the music and entertainment industry complained that the law did not give them the ability to monetise their works on user-generated content platforms ("UGC platforms"). Instead, the law merely provided UGC platforms with liability exemptions, an absence of monitoring obligations and a notice-and-takedown regime.
On this view, content is treated as adequately protected if infringing content is removed once notice is served, rather than through licensing agreements. An intermediary that enters into a licensing agreement with content creators, and thereby cannot avail itself of the liability exemption provisions, may consider paying licensing fees to content creators in order to remain competitive in the market for UGC platforms. This illustrates the "value gap" in the audio-visual industry. The counter-narrative to the "value gap" is the "value grab". It argues that the internet is being treated as a digital threat rather than as a digital opportunity space. As per Giancarlo, the empirical evidence and literature show that the digital platform economy has created value for content providers, rather than creating a value gap that needs to be closed. On the topic of intermediary liability, two types of liability have emerged: primary liability and secondary liability. A hosting provider can still be protected even if it is not completely passive, as long as it has neither knowledge of nor control over the data that is stored.
From the perspective of a European Union lawyer, it is important to assess liability issues within the framework of the Digital Single Market Strategy ("DSMS"). The DSMS focuses on four main issues: "(1) cross-border access to content; (2) text mining and data mining; (3) civil enforcement; and (4) the role of internet service providers ("ISPs")." The role of ISPs in distributing content raises questions about safeguarding copyright-protected works: are creators being fairly remunerated, and does the current law on intermediary liability encourage the generation of content in the future? Content that is deemed illegal, i.e., child pornography, terrorist material and content that infringes intellectual property rights, is excluded from the ambit of "content" that the intermediary needs to distribute. The assessment of content is critical in interpreting the responsibility exercised by online intermediaries in the form of due diligence and a duty of care. Intermediary liability also plays a critical role in identifying the grounds on which an aggrieved party can argue that an internet service provider should not be allowed the leeway of claiming it is merely a hosting provider and/or access provider, and hence entitled to advertising revenues, rather than a service provider as such. A provision is said to provide a "safe harbour" when it does not require intermediaries to monitor the information they transmit or store, and the intermediary does not actively seek facts or circumstances indicative of illegal activity.
From the context of intermediary liability, a progressive legislative endeavour in the form of court rulings may answer the following questions, namely:
- What are the exceptions to forming a digital cross border environment? In simple words, this refers to the content that is required to be removed.
- What is the exclusive right exercised by the intermediary? What issues does the intermediary face when it links one piece of content with another? In what circumstances does the intermediary act as a news aggregator?
- What cases necessitate the enforcement method of the “follow the money” strategy? Is it justified to apply this strategy in infringements that are commercial in nature?
- Under what circumstances should provisional and precautionary measures be applied? Such measures are taken keeping in mind the effect of (1) injunctions in a cross-border environment; (2) notices and the actions that follow a notice being served on a party; and (3) the "takedown and stay down" principle.
- Is it always easy to identify that an intermediary is acting in a "technical, automatic and passive" manner? Do new situations merit the re-categorisation of intermediaries? If so, will the duties of care be different?
- Is the “takedown and stay down” principle the same as monitoring prospectively for flagged content? Does a specific type of content require a specific notice and action procedure?
- Will algorithmic tools be able to privatize law enforcement?
Conclusion
Section 79 of the IT Act, 2000 exempts an intermediary from liability if it adheres to the conditions in Section 79(2). Thus, the legal issues associated with intermediary liability are linked to the intermediary's roles and responsibilities in preventing the dangers of the internet. Private parties also pursue actions against intermediaries under principles of tort law. For instance, credit card companies may block the processing of payments for illegal sales of cigarettes over the internet; another example relates to preventing the online sale of prescription drugs. To conclude, an intermediary is shielded from liability so long as it complies with Section 79(2) of the IT Act, 2000 and does not fall foul of Section 79(3).