CSAM Online: As revealed earlier, Facebook said it removed 8.7 million sexually exploitative images of children in just three months. Following this, in May 2019, Twitter announced that it had suspended 458,989 accounts for violations related to child sexual exploitation on its platform. About six months earlier, WhatsApp said it had removed 130,000 accounts in just 10 days for distributing child sexual abuse material, or child pornography. This makes it clear that one does not need to visit dark web sites to view or share Child Sexual Abuse Material (CSAM). It has also come to light that India is one of the largest contributors to and consumers of CSAM, even though the law declares it completely illegal.

The Protection of Children from Sexual Offences (POCSO) Act (Sections 13-15) clearly states that creating real or simulated CSAM, as well as storing it for commercial purposes, is illegal. Furthermore, Section 67B of the Information Technology Act bars publishing or sharing material depicting a child in a sexually explicit act in any electronic form. Browsing, downloading, advertising, promoting, exchanging, and distributing such material in any form is also prohibited under the Act, given the lasting harm it inflicts on children. The maximum punishment for these offences is seven years' imprisonment.

Nitish Chandan, a cybersecurity specialist, points out that while determining the age of older children depicted in CSAM on these social media platforms can be difficult, companies around the globe are using advanced technology and AI to identify and remove such material, particularly when it involves younger children. Siddharth Pillai of Aarambh, an online portal that works on online safety and combats CSAM, adds that ambiguity arises when the thumbnails on videos or images are merely suggestive, revealing little about the actual content while enticing people to access the videos.

Siddharth and Nitish both agree that, like everything else, CSAM persists because of its high demand. In his research, Nitish found that people do not even bother to mask their phone numbers or profile photos while asking for "cp", or child pornography, in WhatsApp and Telegram groups. Many also appear unaware that soliciting, downloading, and storing CSAM is itself illegal. The demand for CSAM is very real, given the volume of new material released every minute and the number of people consuming it. Aarambh, which allows people to report CSAM on its website, found that in 70% of reported cases, the videos clearly featured children below 10 years of age.

Disclaimer: Darkweblink.com does not promote or endorse claims made by any parties in this article. The information provided here is for general purposes only and is not intended to promote or support the purchase or sale of any products or services, or to serve as a recommendation to become involved in doing so. Neither Darkweblink.com nor any of its members is responsible, directly or indirectly, for any loss or damage caused or alleged to be caused by or in connection with reliance on or use of any content, goods, or services mentioned in this article.
