Restricting dissemination of Child Pornography: Role played by Internet Intermediaries


-Shrey Fatterpekar
Counsel, Bombay High Court

There can be no quarrel with the proposition that child pornography and the sexual exploitation of children are problems which need to be dealt with strongly. With more people accessing the internet than ever before, information can be circulated widely in a matter of minutes. The Delhi High Court in Ms. X v. State & Ors.[i] was concerned with a case involving the dissemination of child pornography/child sexual abuse materials on the internet. In this case, the Court considered the role played by intermediaries such as Facebook and Google and the steps taken by them to curb such dissemination.

Intermediaries: Whether liable?

Under Section 79 of the Information Technology Act, 2000, an intermediary like Facebook or Google is not liable for any information hosted on its platform, provided the information is made available by a third party. This would include child pornography or child sexual abuse materials posted by third parties. The protection afforded to intermediaries is, however, conditional, i.e. subject to fulfilment of the conditions prescribed under Section 79. One such condition, prescribed under Section 79(3)(b), is that the intermediary, upon receiving actual knowledge or on being notified by an authority that any information is unlawful, must expeditiously remove or disable access to such information. The requirement of knowledge laid down under Section 79 has been read down by the Supreme Court in Shreya Singhal v. Union of India[ii] to mean knowledge that a court order has been passed asking the intermediary to remove or disable access to such information. In Ms. X’s case, the intermediaries had previously been ordered to ensure removal of the material identified by Ms. X and her lawyers in her petition. This order had been complied with and, accordingly, the intermediaries were entitled to the protection conferred upon them by Section 79.

Responsibility of intermediaries to prevent such information being made available

The Court then considered whether the intermediaries were responsible for ensuring that child pornography or child sexual abuse materials were not hosted on their platforms. The Court observed that under Section 20 of the Protection of Children from Sexual Offences Act, 2012 and Rule 11 of the Protection of Children from Sexual Offences Rules, 2020, an intermediary is inter alia required to report to the Special Juvenile Police Unit in case child pornography or child sexual abuse materials are made available on its platform.

In this regard, both Facebook and Google filed affidavits before the Court inter alia highlighting the steps they had taken to deal with this issue. Both platforms stated that they had implemented strict user policies prohibiting the posting of child pornography or child sexual abuse materials. In addition, it was stated that they were actively working with law enforcement agencies to combat this issue.

In addition, Facebook stated that it was using ‘PhotoDNA’ to identify any known/apparent child pornography image and was working on actively identifying keywords related to such content. Google stated that it was using ‘video hashing’ technology to prevent re-uploads of identical copies of videos removed for violating its Community Guidelines. It also used Artificial Intelligence (AI) and machine learning tools to deal with child pornography content on its platforms. At the time of hearing, it was explained that though AI was being used to remove such content, the technology had inherent limitations, as any image, even if slightly altered, could escape detection.
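For readers unfamiliar with how such hash-based matching works, the following is a minimal illustrative sketch in Python. It is not PhotoDNA or Google’s video-hashing technology (both are proprietary); the toy ‘average hash’ below is only a stand-in that shows why an exact cryptographic hash fails to match a slightly altered copy, while a perceptual-style hash can still flag it as a near-duplicate. The function names and sample data are purely illustrative assumptions.

import hashlib

def exact_hash(pixels):
    # Cryptographic hash: changes completely if even one pixel value differs.
    data = bytes(value for row in pixels for value in row)
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set when the pixel is brighter
    # than the image's mean. Small alterations flip few (or no) bits.
    flat = [value for row in pixels for value in row]
    mean = sum(flat) / len(flat)
    return [1 if value > mean else 0 for value in flat]

def hamming_distance(hash_a, hash_b):
    # Number of differing bits; a small distance suggests a near-duplicate.
    return sum(a != b for a, b in zip(hash_a, hash_b))

# A tiny 4x4 greyscale "image" (pixel values 0-255) and a slightly brightened copy.
original = [[10, 200, 30, 220],
            [15, 210, 25, 230],
            [12, 205, 35, 225],
            [18, 215, 28, 235]]
altered = [[value + 2 for value in row] for row in original]

print(exact_hash(original) == exact_hash(altered))   # False: exact matching misses the altered copy
print(hamming_distance(average_hash(original),
                       average_hash(altered)))        # 0: the perceptual hash still flags it

Real-world systems use far more robust transforms than this toy example, but the intuition is the same: slight alterations defeat exact matching, which is why the Court’s direction, discussed below, is qualified by the limitations of the tools available to the intermediaries.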

In view of the statutory provisions set out above and the use of AI by the intermediaries, the Court held that intermediaries were required to ensure that child pornography or child sexual abuse materials were not hosted on their platforms. The Court then directed both Facebook and Google to remove content ‘similar’ to the contents of the specific links mentioned in the petition (which links had already been disabled), with due regard to the tools available with them and the limitations of such tools.

Implication of this decision

The intent of the Delhi High Court’s decision is laudable. Child pornography is a serious issue, and requiring intermediaries to actively disable access to such information will enable law enforcement agencies to deal with this problem effectively. Ordinarily, in view of the Supreme Court’s dictum in Shreya Singhal (supra), intermediaries are required to disable access to specific content adjudged to be unlawful by courts. However, in this case, in view of the statutory provisions set out above, the Delhi High Court has directed the intermediaries to disable not only the specific URLs set out in the petition, but also content similar to those URLs. This order thus imposes a greater burden on intermediaries like Facebook and Google to use the tools at their disposal to, in effect, monitor information exchanged on their platforms and take down material pertaining to child pornography.

It is pertinent to note that this order would be applicable only in cases involving child pornography and not in other cases. This is because other cases are not as straightforward as cases involving the circulation of child pornography. For example, only a court can adjudge whether a particular post made on Facebook is defamatory and can direct Facebook to take such a post down. However, if Facebook were thereafter broadly directed to take down materials ‘similar’ to that post, it may amount to putting Facebook in a position where it is required to decide whether another post is defamatory or not. This would in fact be contrary to the Supreme Court’s decision in Shreya Singhal (supra), which expressly provides that an intermediary is required to remove information only after a court directs it to do so. In fact, it was only on this basis that Section 79 of the Information Technology Act was held to be constitutionally valid. Similarly, this broad test will not be applicable in disputes involving intellectual property rights such as trademarks or copyrights. Whether a particular post infringes a trademark or copyright is something only a judge can adjudicate on the basis of tests formulated in law. AI, as of today, cannot perform these adjudicatory functions.

Conclusion

The decision of the Delhi High Court in Ms. X v. State & Ors. is a step in the right direction in curbing the menace of child pornography. Intermediaries are now required to use all tools at their disposal, including artificial intelligence, to restrict the circulation of child pornography/child sexual abuse materials on the internet. This should in turn help enforcement agencies prevent the large-scale circulation of such materials and identify the wrongdoers involved.


References:

[i] Ms. X v. State & Ors., order dated 20th October 2020 passed in W.P.(CRL.) 1080/2020 (Delhi High Court)

[ii] Shreya Singhal v. Union of India, (2015) 5 SCC 1
