Child Sexual Abuse Material on End-to-End Encrypted Platforms: An Overview of the Report
Cyber Peace Foundation has recently published a comprehensive report focused primarily on the proliferation of Child Sexual Abuse and Exploitation Material (CSAM and CSEM) on end-to-end encrypted (E2EE) platforms such as WhatsApp and Telegram. This blog briefly touches upon the focus, objectives, and rationale of the study, while also highlighting its findings.
The rationale and inspiration behind the study
The world seldom comes together to fight against a social evil; the global collaborative effort against CSAM and CSEM is one such rare instance where the world has agreed to join hands. This collaborative effort can be observed in various legislations and directions, such as Directive 2000/31/EC on electronic commerce issued by the European Union. This directive establishes the liability of an internet service provider that is aware of an illegal activity on its platform and does not take any action against it. Another example is how companies based in the United States must report such material to the National Center for Missing & Exploited Children (NCMEC), which works closely with law enforcement and federal agencies to curb this issue.
However, most of these global efforts are effective only on surface web resources, not on platforms that are end-to-end encrypted. E2EE platforms provide a level of anonymity to the source of such material, and their reporting mechanisms are largely ineffective, which works in the perpetrators' favour: the platform cannot access the data being reported, because it is encrypted. There have also been many collaborative efforts with various private entities; however, the report highlights how most of the concrete principles for fighting this menace are functional only on the surface web and not on E2EE platforms. This ineffectiveness of the present policy and technological framework for reporting CSAM and CSEM became the rationale behind the report: to establish the exact extent of the problem.
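To make this technical constraint concrete, the sketch below illustrates, in very simplified form, why an E2EE platform cannot inspect the content that users report. It is a minimal toy example using the PyNaCl library; the names `alice_key`, `bob_key`, and `relay_server` are illustrative assumptions, not anything from the report, and real platforms such as WhatsApp use the far more elaborate Signal protocol rather than a single static key pair.

```python
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; only the public
# keys are ever shared with the platform.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts a message so that only Bob's private key can decrypt it.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"a private message")

def relay_server(payload: bytes) -> bytes:
    # The platform only relays opaque bytes: it holds no private key,
    # so it cannot read, verify, or moderate the content it carries.
    assert b"a private message" not in payload
    return payload

# Only Bob, holding the matching private key, recovers the plaintext.
receiver_box = Box(bob_key, alice_key.public_key)
plaintext = receiver_box.decrypt(relay_server(ciphertext))
assert plaintext == b"a private message"
```

The essential point is that the relay only ever handles opaque ciphertext, so a report about a message's content cannot be verified on the server side.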
Methodology and focus of the research
As highlighted above, the objective of this study, conducted during June and July 2020, was to assess the issue of CSAM on E2EE platforms while also analyzing the legal, technical, and policy frameworks around the world that aim to prevent the spread of CSAM. The study also presents findings of research conducted on various social media and messaging platforms that have E2EE technology built into them. The research and findings were primarily focused on the popular messaging platforms in India, WhatsApp and Telegram, examining their features and checking their responsiveness towards the posting, and subsequent reporting, of such content. While extensively researching these applications, the researchers also identified gaps and challenges in the various technologies and policies associated with curbing CSAM.
Findings of the report
WhatsApp: The research built upon the findings of an earlier study specifically focused on finding CSAM on WhatsApp. The initial findings reported that, out of 182 WhatsApp group links reported back in 2019 for containing pornographic content, or in some cases pornographic display pictures, 110 groups were still operational. Some of these groups were still actively sharing CSAM as of June 2020. The study also identified 1,299 additional WhatsApp groups sharing adult pornographic content, with links to them publicly available. Of these 1,299 groups, 29 were studied in depth, and over 100 instances of CSAM were identified in them.
Telegram: The research began by finding over 350 publicly available links to channels spreading pornographic content. Of these 350 channels, readily available in the public domain, only 283 appeared to be operational. These channels mostly contained sexual content produced with adults by large production houses of the adult video industry. However, certain self-recorded videos, content soliciting sex work, and some thumbnails and images with links to sexually explicit content involving minors were also observed on these channels. Around 23 instances were found where users were proliferating CSAM. The research also found that about 85% of the channels reported for spreading pornography or child abuse material were shut down.
While this responsive and reasonably effective reporting mechanism on Telegram is very helpful for users and victims, it still counts as reporting on the surface web, as the reported channels are publicly accessible. The case is not the same for private and secret conversations, which are encrypted. In fact, this problem exists on both platforms, i.e., WhatsApp as well: reporting of E2EE chats and profiles is ineffective, and usually no action is taken.