Kill the Messenger? Or Not?: Intermediary Liability for Child Pornography
Has technology acted as a multiplying force, creating a new forum for offline forms of crime against children? Has anonymity eased the manipulation of a child’s sensitivities? How do intermediaries design a tool that shields a child from a harmful image without walling a citizen off from political or cultural information online?
With the total number of Internet subscribers in India reaching 350.48 million, the rampant distribution of child pornographic content (“CPC”) through the internet has raised alarm, with specific legislation on child pornography still absent in 35 countries. Though domestic law is at par with global practice in combating CPC, the real question is how far policy and programme rhetoric will travel into the realm of progressive empirical reality for children.
The following factors have aggravated the current scenario in India:
• a 147% increase in the number of web pages containing child sexual abuse material (“CSAM”) between 2012 and 2014, with children aged 10 or younger portrayed in 80% of this material;
• absence of independent empirical research and analysis on the online behaviour patterns of children;
• marketing-oriented research by Information and Communication Technology (“ICT”) companies;
• absence of a national image database and of prominently displayed grievance-reporting facilities;
• a skewed picture of offences against children from the National Crime Records Bureau (“NCRB”), which counts only reported cases;
• lack of safeguards in digital devices at home and in Internet cafés; and
• patchy enforcement of the Cyber Café rules that mandate cyber-cafés to monitor content.
Connecting Offline with Online: A Legal Snapshot
In a country where both the abuse and the abused are stigmatised, and where a culture of silence chokes off reporting, various laws and policies have been put in place.
A. Protection of Children from Sexual Offences Act, 2012 (“POCSO”)
POCSO centres on the protection of children from sexual harassment, with “sexual intent” ascertained on the factual matrix of each case, and on the preparation, production, offering, transmission, publication, facilitation and distribution of pornographic material through any medium, irrespective of intent.
POCSO also penalises the storage of pornographic material involving a child for commercial purposes, and the abetment of and attempt to commit such offences. It widens the reporting of such events: any person who apprehends that an offence is likely to be committed, or has “knowledge” that such an offence has been committed, may report it and shall incur no civil or criminal liability for giving the information in good faith. It obligates the personnel of any facility, by whatever name called, to provide information to the Special Juvenile Police Unit or to the local police station on coming across any such material, and penalises failure to report or record such an incident. It also imposes joint and several liability on the publisher or owner of a media outlet for the acts and omissions of employees in relation to (a) disclosing a child’s identity in any form, unless permitted by the competent Special Court (“SC”) where such disclosure is in the child’s interest; and (b) making reports or comments on any child in any form of media, without complete and authentic information, which may lower the child’s reputation or infringe his privacy.
Notwithstanding anything in the IT Act, SCs shall have jurisdiction to try offences under Section 67B of the Information Technology Act, 2000 (“IT Act”) in so far as they relate to the publication or transmission of sexually explicit material depicting children, or to facilitating the abuse of children online. The SC may also determine the age of a person if the question arises in any proceeding, and may, in appropriate cases, direct compensation for the physical or mental trauma caused to the child. POCSO also prescribes bodies to monitor its implementation.
B. IT Act
The IT Act deals with online offences against children. It penalises the publication or transmission, in electronic form, of material depicting children in a sexually explicit act, including by facilitators of online child abuse. This does not, however, extend to any representation in electronic form which is proved to be justified as being for the public good, in that it serves the interest of science, literature, art or learning or other objects of general concern, or which is kept or used for bona fide heritage or religious purposes. An intermediary shall not be liable for any third-party information, data or communication link hosted by it, except in cases of violations of the Copyright Act, 1957 or the Patents Act, 1970. This protection is conditional on (a) the intermediary’s function being limited to providing access to a communication system over which third-party content is transmitted or stored, without initiating the transmission, selecting its receiver, or selecting or modifying the information it contains; and (b) its compliance with due diligence obligations. The shield is broken when the intermediary (a) has conspired, abetted, aided or induced, whether by threats, promise or otherwise, the commission of the unlawful act; or (b) upon receiving “actual knowledge”, or on being notified by the appropriate Government or its agency, fails to expeditiously remove or disable access to any information, data or communication link residing in or connected to a computer resource it controls that is being used to commit the unlawful act. “Due diligence” obligates the intermediary to inform its users, via its rules and regulations, privacy policy and user agreement, not to host, display, upload, modify, publish, transmit, update or share any information which is, inter alia, obscene, pornographic or paedophilic, otherwise unlawful in any manner whatever, harmful to minors in any way, or threatening to public order. The intermediary shall not knowingly host or publish any such information. In relation to CPC, intermediaries are obligated to preserve and retain such information in a specified manner and for a specified duration, and are penalised when they “intentionally or knowingly” fail to do so.
C. ISP Licensing Regime
Telecommunication service providers, network service providers, and ISPs are mandated to prevent the carriage of objectionable content once such instances are reported, and to enable lawful monitoring and interception of communications by the Government.
D. Analysing Intermediary Liability
The debate centres on two liability regimes: strict liability and fault-based liability. On a conjoint reading of the provisions of both Acts, it is clear that an application of mind is required to assess content objectively on its facts and context. It would violate principles of natural justice if a neutral intermediary were made to deliberate on the subjective character of the alleged content. With the servers of most intermediaries based overseas, no global uniformity on the age of a child, and content exploding every micro-second without any human control, it is quite a task to burden the intermediary as a “publisher” when it possesses no editorial control over the content being uploaded. As for requiring intermediaries to pre-filter content, apart from technical challenges (pornographic content is transmitted over encrypted “https” websites, which makes URL-level filtering difficult, as the sketch below illustrates), there is also the probability of filtering genuine content and degrading system performance. An intermediary’s fear of prosecution will produce a chilling effect on free expression, with the intermediary indulging in excessive self-censorship to avoid undue and unfair liability. The obligation under POCSO to report, and the penalty for failing to do so, would be an added burden on the intermediary, leaving it spiralled into interpreting actual versus constructive knowledge.
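A minimal sketch of the encryption problem, under stated assumptions: the hostnames and paths below are hypothetical, and the on-path filter is assumed to see only what HTTPS actually exposes in transit (the hostname, via DNS or the TLS handshake, but never the encrypted URL path):

```python
# Sketch (hypothetical hostnames and paths) of why URL-level filtering
# degrades over HTTPS: the filter sees the hostname but not the encrypted
# path, so it must either miss the target page or block the whole host.

BLOCKED_URLS = {"example.com/objectionable-page"}            # precise URL blocklist
BLOCKED_HOSTS = {u.split("/", 1)[0] for u in BLOCKED_URLS}   # coarse host-level fallback

def filter_decision(url: str, https: bool) -> str:
    host, _, _path = url.partition("/")
    if not https:
        # Plain HTTP: the full URL is visible, so precise blocking works.
        return "block page" if url in BLOCKED_URLS else "allow"
    # HTTPS: only the hostname is visible; the decision is all-or-nothing.
    return "block entire host" if host in BLOCKED_HOSTS else "allow"

print(filter_decision("example.com/objectionable-page", https=False))  # block page
print(filter_decision("example.com/legitimate-page", https=True))      # block entire host
```

The over-blocking of the legitimate page in the second call is precisely the risk of filtering genuine content noted above; declining the coarse host-level fallback would instead let the objectionable page through unfiltered.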
However, the intermediary may be held strictly liable for an “intentional” failure to adopt data retention and preservation standards, once such standards are legislated by the Government. The analogy of the Pirate Bay judgment, which convicted the owners of the torrent site, can be used to decipher the “intention” of an intermediary in relation to CPC: the court held that, because the owners of the website had made clear through their message board that their intention was to distribute copyright-infringing material, they were to be held liable.
Further, a clear-cut distinction between the different roles, responsibilities and liability regimes of intermediaries, depending on their function, can be adopted, similar to German law. It can be argued that the existing horizontal approach, covering interception, decryption, monitoring of content, and receiving or storing information as prescribed, is feasible for an intermediary, whereas pre-filtering is not. For pre-filtering, the legitimacy of state control online becomes relevant in the context of CPC. Here, the issue is less whether the state has the right to assert control over such material than what the most effective means is of combating the problem it represents, and the problems to which it leads, without undercutting the rights guaranteed to citizens.
It is difficult to see how intermediaries could block CPC without affording any opportunity for a hearing, treating urgency (where delay in blocking can have fatal consequences) as the general rule rather than the exception. In the wake of the Kamlesh Vaswani petition seeking a nation-wide ban on internet pornography, India witnessed the disablement of 857 websites’ URLs by the Department of Telecommunications (“DoT”) under Section 79(3)(b) of the IT Act. The directions the ISPs received from the Government were in conflict, and the validity of blocking an entire website with objectionable content, as against only the specific URL carrying it, was also contested. Spiralling litigation around intermediary liability, with not a single conviction on CSAM, worsens the hydra-headed menace.
Way Forward
Firstly, a cost-benefit analysis of policy, of revamping the intermediary structure, of filtering mechanisms and of network upgradation should be undertaken.
Secondly, adequate wellness and resilience counselling should be provided to the content moderators on intermediaries’ online-abuse teams.
Thirdly, safety and security initiatives, including hashing and image-matching technology such as the PhotoDNA technology adopted by Facebook, Google and Microsoft, can be used by intermediaries to track the distribution of images of child sexual exploitation, identify new leads for clamping down on perpetrators, and prevent the recirculation of such images; a simplified sketch of the hash-matching idea follows.
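A minimal illustration of that idea, with a loud caveat: PhotoDNA itself computes a proprietary perceptual hash that survives resizing and re-encoding, whereas the cryptographic hash used below is a simplifying assumption that matches only byte-identical files; the database and function names are likewise hypothetical:

```python
# Sketch of screening uploads against fingerprints of known abusive images.
# NOTE: real deployments (e.g. PhotoDNA) use perceptual hashes robust to
# resizing and re-encoding; SHA-256 here is a simplifying assumption and
# only catches byte-identical copies.
import hashlib

# Hypothetical database of fingerprints of known material, as might be
# supplied by a hotline or clearinghouse.
KNOWN_FINGERPRINTS: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches known material and should be
    blocked, preserved per retention rules, and reported."""
    return fingerprint(image_bytes) in KNOWN_FINGERPRINTS
```

Because matching is done against fingerprints rather than the images themselves, an intermediary can screen uploads without retaining or redistributing the underlying material.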
"Disclaimer": The above views are personal, and do not constitute legal advice.