A series of reports from tech companies shows that the amount of audiovisual material depicting child sex abuse online peaked in 2019, rising by up to 50% compared with the previous year. The data showed that the tech platforms with the most users worldwide, such as Facebook and Instagram, have users sharing illegal content despite their security and privacy policies.
Information Center Filled with Pictures
The National Center for Missing & Exploited Children is an office with federal jurisdiction in the USA, and it works with every public security agency. Last year, this office compiled at least 70 million pictures and videos of illegal content showing child sex abuse.
The Center reported that the amount of video shared through social media reached a record. Unfortunately, these illegal videos have long been popular among sex predators, although they are now more easily detected by some of the aforementioned digital companies.
More than 41 million videos were reported, an alarming figure compared to five years ago, when the number did not exceed 350 thousand. Many of the pictures and videos were flagged on multiple occasions as users shared them.
The Center shared with The New York Times which digital platforms reported the most pictures of child sex abuse. The report breaks down as follows:
Facebook: 60 million pictures and videos. The figure represents more than 85% of the total.
Instagram: now owned by Facebook, it accounted for an additional 1.7 million pictures and videos.
Instagram had been the social network of choice for pedophiles; however, Facebook’s Messenger is now taking the lead, and its larger user base may explain the increase.
Facebook claims that half the content found does not fully qualify as illegal, but it was reported as such to support the investigations that various public security institutions are conducting on this matter.
Snapchat, Twitter, and other social network companies also handed in their reports on pictures detected on their platforms. Moreover, Google, Microsoft, Dropbox, and the chat platform Discord also detected illegal content.
Children are not Adult Entertainment
An adolescent notified the adult website Pornhub that a video of her rape had been posted on the platform, with very explicit taglines describing what users would watch. Not only did the victim have to face the trauma of the rape, but the online exposure made her relive the attack over and over again.
The crime happened back in 2009, when the victim was 14 years old. Two men recorded parts of the rape and posted the footage on the website. A few months later, several people from her school shared Pornhub links to the various videos of the attack she had been subjected to.
It was not until she pretended to be a lawyer ready to sue that the aforementioned website removed the videos. For the victim, each of the 400 thousand views felt like a new attack.
The website stated that the accusations over videos with illegal content date back to 2009, under its previous management, and that today it enforces much stricter measures and policies against unauthorized and illegal content as part of its stand against the promotion of material displaying child sex abuse.
Currently, digital platforms use sophisticated technologies, informed by the experience of their workers, to detect, report, and eliminate any audiovisual material with illegal content or child abuse, as well as any attempt to share said content. Many companies train their own staff; others use third parties like Vobile, a specialist in digital identification, which compares all new posts against known unauthorized material and ensures the original video does not make it back onto the platform.
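The general idea behind this kind of matching can be sketched very simply: keep a database of digital fingerprints (hashes) of known illegal files and check every new upload against it. The sketch below uses an exact SHA-256 match for illustration only; real systems such as Vobile's, or Microsoft's PhotoDNA, use perceptual fingerprints that survive re-encoding and cropping, and the blocklist entry here is a made-up sample, not real data.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known flagged files.
# (The entry below is just the digest of the sample bytes b"hello".)
KNOWN_HASHES = {
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
}

def is_known_content(file_bytes: bytes) -> bool:
    """Return True if the upload's digest matches a blocklisted hash."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

# An upload whose bytes hash to a listed digest would be blocked
# before it ever appears on the platform:
print(is_known_content(b"hello"))        # True  (matches the sample entry)
print(is_known_content(b"other bytes"))  # False (unknown content)
```

Exact hashing like this fails as soon as a file is re-compressed or trimmed, which is precisely why the industry relies on perceptual hashing instead; the structure of the check, however, is the same.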