Introduction:

Undoubtedly, the internet is a transformative force that has reshaped and revolutionized how we access knowledge, conduct business, and maintain relationships. It has dismantled geographical barriers, allowing instant global communication, facilitating e-commerce, and fostering a sense of interconnectedness among people across the globe.

Meanwhile, we must also remain acutely aware of the consequences it can entail, particularly for the younger generation, who may find themselves vulnerable to its addictive qualities. The addictive nature of the internet, coupled with the constant influx of content and the allure of social media, can pose a significant risk to the well-being of children. As they navigate this digital landscape, the potential for addiction, distraction and exposure to inappropriate content becomes a growing concern for parents, educators and society as a whole.

A national survey conducted in India found that six out of ten youngsters between the ages of 9 and 17 spend over three hours daily on social media or gaming sites. The survey revealed that prolonged social media use increases the risk of mental health problems such as depression and anxiety among children. Additionally, children exhibited signs of aggression, impatience, and hyperactivity after spending extended periods on social media.1

Harsh Reality in the Digital Realm: The National Human Rights Commission

The National Human Rights Commission (NHRC) acknowledges the substantial impact of the internet on contemporary society. However, it cannot overlook the unfortunate reality that, in certain realms such as child sexual abuse, the internet has facilitated considerable harm. The production, distribution and consumption of Child Sexual Abuse Material (CSAM) represent one of the most abhorrent forms of sexual harassment and abuse endured by children, violating their fundamental human rights.

In light of the distressing increase in the proliferation/dissemination of CSAM, the NHRC has taken a proactive step by issuing an advisory intended to confront this urgent issue comprehensively. This advisory recognizes the urgent need to protect children from the adverse consequences of CSAM and underscores the NHRC's commitment to safeguarding their rights and dignity in the digital environment.

Statistics:

According to the National Center for Missing and Exploited Children (NCMEC) 'CyberTipline 2022 Report', out of the 32 million reports received by NCMEC, 5.6 million reports pertained to CSAM uploaded by perpetrators based in India. A total of 1,505 instances of publishing, storing and transmitting CSAM under Section 67B of the Information Technology (IT) Act, 2000 and Sections 14 and 15 of the Protection of Children from Sexual Offences (POCSO) Act, 2012 were reported in 2021.

The production of CSAM establishes an enduring record of sexual abuse, while its subsequent dissemination through the internet and other channels perpetuates the victimization of children.

This continuous exposure has a profound and lasting psychological impact on the child, disrupting their overall development. Therefore, the pressing necessity lies in the effective identification and blocking of CSAM content, prompt data sharing among concerned stakeholders, and the swift prosecution of offenders.

Recently, 41 US states sued Meta, alleging that the tech giant harms children by integrating addictive features into its social media applications, a significant stride by state enforcers to tackle the impact of social media on children's mental health. A 233-page federal complaint alleges that the company engaged in a 'scheme to exploit young users for profit' by misleading them about safety features and the prevalence of harmful content, harvesting their data and violating federal laws on children's privacy. Meta rejected the allegations, saying they were false and demonstrated a deep misunderstanding of the facts. Since then, Meta has unveiled numerous policy and product changes intended to make its apps safer for children, including giving parents tools to track activity, building in warnings that urge teens to take a break from social media and implementing stricter privacy settings by default for young users.2

Advisory for Protection of the Rights of Children against CSAM: NHRC

Given that the NHRC is entrusted with the duty of safeguarding and advocating for the human rights of all individuals, including children, under Section 12 of the Protection of Human Rights Act, 1993, it has taken significant strides to shield them in the digital realm. This has culminated in the issuance of the following advisory, consisting of four parts, titled "Advisory for Protection of the Rights of Children against Production, Distribution and Consumption of Child Sexual Abuse Material (CSAM)."3

  1. Part I deals with addressing legal gaps and issues of harmonization of laws pertaining to CSAM:
    • One significant aspect of this part of the advisory is the proposal to replace the term 'Child Pornography' with 'Child Sexual Abuse Material (CSAM)' in Section 2(1)(da) of the POCSO Act, 2012. The NHRC firmly believes that more accurate terms such as 'use of children in pornographic performances and materials', 'child sexual abuse material', and 'child sexual exploitation material' should be preferred over 'Child Pornography'.
    • Furthermore, it calls for the government to redefine the term 'sexually explicit' under Section 67B of the IT Act, 2000 to facilitate the improved identification and removal of CSAM along with expanding the definition of 'intermediary' by including Virtual Private Network (VPN) service providers, Virtual Private Servers (VPS) and Cloud Service Providers to avoid any ambiguity.
    • The advisory also addresses the need to reconsider the quantum of punishment, taking into account the gravity of the offences pertaining to online CSAM under Section 14 of the POCSO Act and Section 67B of the IT Act, especially when the sentences are seven years or less. It suggests re-evaluation of, or potential legislative changes to, Section 41A of the CrPC.
    • The advisory also suggests reviewing the necessity of issuing a certificate under Section 65B of the Indian Evidence Act, 1872 in online CSAM cases, aiming to prevent delays and expedite the investigation process.
  2. Part II contains measures for monitoring and regulating Internet Intermediaries, including use of technology to monitor CSAM content online, sharing of information and cooperation with the Government:
    • The advisory recommends leveraging technology on platforms such as social media and Over-The-Top (OTT) services to proactively detect CSAM and remove it. Similarly, platforms using End-to-End Encryption may be mandated to devise additional protocols/technology to monitor the circulation of CSAM.
    • Intermediaries, including social media platforms, Over-The-Top (OTT) applications, and cloud service providers, are urged to develop a CSAM-specific policy. This policy should clearly outline a user-friendly in-house reporting mechanism, notification of a dedicated point of contact, standardized response times, and the use of technology for detection and removal of CSAM from their respective platforms. The policy should be framed in consultation with the Government and conveyed to users by prominently displaying it.
    • The advisory emphasizes that, given the rapid dissemination of online CSAM, the time allowed for intermediaries to remove content after receiving notification from the appropriate government or authorised agencies should not exceed six hours. This is a substantial reduction from the existing 36-hour window specified in Rule 3(1)(d) of the Intermediary Guidelines, 2021. The swifter response time is essential to combat the urgent challenges posed by the circulation of such harmful material.
    • Further, the advisory urges internet service providers, web browsers and OTT players to display pop-up warning messages when users search for content related to CSAM. This measure is crucial for raising awareness and preventing inadvertent access to harmful content, thereby contributing to the overall effort to combat the spread of CSAM.
  3. Part III pertains to creation of a specialized mechanism of law enforcement for addressing CSAM as well as strengthening the existing mechanism involved in detection, investigation and monitoring of CSAM.
    • The advisory emphasizes that every State/UT should have at least one Specialized State Police Unit for the detection and investigation of CSAM-related cases and the apprehension of offenders.
    • The advisory advocates for the creation and maintenance of a national database of such material, with hash values, by the proposed Specialized Central Police Unit, so that the listed content can be blocked by intermediaries.
    • In accordance with this advisory, the proposed Specialized Central Police Unit should be responsible for the collection of detailed data on the prevalence, trends and patterns of CSAM. This data should be disaggregated by factors such as gender, age, caste, ethnicity and other socioeconomic parameters, which would aid a better understanding of the issue and allow for more informed policy interventions that address the complexities surrounding CSAM comprehensively.
    • Further, it also recommends establishing a separate dashboard on CSAM-related offences, to be incorporated in the Inter-Operable Criminal Justice System (ICJS) / Crime and Criminal Tracking Network & Systems (CCTNS) database maintained by the National Crime Records Bureau (NCRB) for the Investigation Tracking System for Sexual Offences (ITSSO).
  4. Part IV recommends measures for capacity building and training of officials, sensitization, awareness and support to survivors of CSAM.
    • It stresses creating awareness and sensitization among students, parents and teachers at schools, colleges and institutions, through means such as workshops, on the modus operandi of online child sexual abusers, the specific vulnerabilities of children online, reporting mechanisms, recognizing early signs of online child abuse and grooming through emotional and behavioral indicators, the use of parental control apps, and internet safety for children.
    • Further, the advisory emphasizes that SMS alerts cautioning users about CSAM should be sent to mobile users through Telecom Service Providers every quarter/month.
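The hash-matching mechanism envisaged in Part III, a central database of hash values of known material that intermediaries consult before allowing an upload, can be sketched in a few lines. This is a minimal illustration only: the database contents and function names here are hypothetical, and production systems typically use perceptual hashes (such as PhotoDNA) that also catch visually similar images, whereas a cryptographic hash like SHA-256 matches only exact copies.

```python
import hashlib

# Hypothetical stand-in for the proposed national database: a set of
# SHA-256 digests of known harmful files. Only hash values are stored,
# never the material itself.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-harmful-bytes").hexdigest(),
}

def should_block(file_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 digest appears in the shared database."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```

Under this scheme, an intermediary would compute the hash of each upload and refuse to publish any file whose digest appears in the shared database; because identical files always produce identical digests, matching works without the database ever containing the material itself.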

Conclusion:

The NHRC has advised all concerned authorities of the Union/State Government(s)/UT Administration(s) to implement the recommendations contained in this comprehensive advisory, both in letter and spirit, and to submit an Action Taken Report (ATR) within a period of two months, emphasizing the urgency of this matter.

The NHRC's recommendations underscore the pressing need to combat the dissemination of CSAM and protect the rights of children. They emphasize the pivotal role of technology, legal frameworks and terminology in addressing this deeply concerning issue. By implementing these measures, the NHRC aims to significantly enhance efforts to eradicate the production, distribution and consumption of Child Sexual Abuse Material, ultimately ensuring a safer environment for children in India.

Footnotes

1. https://timesofindia.indiatimes.com/city/mumbai/60-children-spend-3-hours-a-day-on-social-media-study/articleshow/103878956.cms

2. https://www.washingtonpost.com/technology/2023/10/24/meta-lawsuit-facebook-instagram-children-mental-health/

3. https://nhrc.nic.in/sites/default/files/Advisory%20on%20CSAM_Oct2023.pdf

For further information, please contact S.S. Rana & Co. at email: info@ssrana.in or call (+91-11 4012 3000). Our website can be accessed at www.ssrana.in

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.