Twitter alone was responsible for half of the reported cases of child abuse images found online in the past three years. Indecent material has even found its way onto some of the websites of the UK's largest tech companies, suggesting that their safeguards are far from watertight.
The true number of child abuse images and videos on the net is unknown, as most reported incidents include backlinks to child abuse websites, meaning that a single report could represent thousands of indecent images.
The IWF said, “figures show the number of child abuse images and URLs being openly hosted on popular sites was increasing year-on-year, with 742 incidents found in 2016, 1,016 in 2017 and 1,077 in 2018.”
In 2018, Facebook, Instagram and WhatsApp were responsible for 4,000 reported incidents, according to research by the NSPCC. The incorporation of end-to-end encryption into most messaging services is likely to increase the number of child abuse incidents that go unreported each year. Facebook, the biggest social media platform in the world, takes a particularly firm stance on the need for such encryption technology.
“People’s private communications should be secure. End-to-end encryption prevents anyone — including us — from seeing what people share on our services.” — Mark Zuckerberg, Facebook CEO.
It’s undoubtedly a fine line to tread. How do we balance the need for privacy against the need to protect our children from sexual predators? Are Facebook and WhatsApp giving those who might harm our children a haven, a secure place to spread indecent material online?
The IWF, by contrast, has no qualms about detecting and sharing such information. On 12 December 2019, the organisation announced a ground-breaking ruling that allows it to share its discoveries with agencies in the USA. The IWF generates PhotoDNA (pDNA) hashes of child sexual abuse images. Hashes are, in essence, digital fingerprints or breadcrumbs that help agencies follow the trail back to the source of indecent material.
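PhotoDNA itself is a proprietary perceptual hash designed to keep matching an image even after resizing or re-compression. As a rough illustration of the general hash-and-match workflow, the sketch below uses an ordinary cryptographic hash (SHA-256) instead, so it only matches byte-identical files; all function names and data here are hypothetical examples, not part of any real agency system.

```python
import hashlib

# Hypothetical shared blocklist of known fingerprints (illustrative only).
known_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the file's digital fingerprint."""
    return hashlib.sha256(data).hexdigest()

def register_known_image(data: bytes) -> None:
    """Add an image's fingerprint to the shared blocklist."""
    known_hashes.add(fingerprint(data))

def is_known_image(data: bytes) -> bool:
    """Check an upload's fingerprint against the shared blocklist."""
    return fingerprint(data) in known_hashes

# Usage: a platform registers a known image, then screens new uploads.
register_known_image(b"example-image-bytes")
print(is_known_image(b"example-image-bytes"))    # True
print(is_known_image(b"different-image-bytes"))  # False
```

Because only fingerprints are exchanged, agencies can share and compare them without ever transmitting the indecent material itself.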
Apple, too, has started to scan photos uploaded to iCloud, in an attempt to address questions about the role its technology plays in the spread of child abuse imagery on the web.
The company openly states on its website that it intends to support innovation that better protects children online and will continue to work with agencies that share its vision.
Like Facebook and its WhatsApp messaging service, Apple has come under fire for its stance on data protection in the past. It has even refused to unlock criminals’ phones, arguing that doing so would breach user privacy. This U-turn is therefore a much-needed step in the right direction.
The problem is clearly on the minds of some of the most prominent players in the tech industry, but their hands are often tied by data protection laws. If common sense is to prevail, more work is needed to identify malicious content early, long before it ever reaches the servers.
Please help us to protect you and your children, and to abolish child abuse online.