A new study by the Canadian Centre for Child Protection shines a light not on the big players or social networks, but on providers that receive less coverage, such as file-hosting and image-hosting services.
It strongly confirms jugendschutz.net's own findings on the issue: the above-mentioned services are used on a massive scale to distribute CSAM, and the providers take little care to prevent it. Additionally, links to the content can often be found on other services where offenders connect with each other, such as forums, social media, or the dark web.
One of the key findings was that 48% of the content previously identified and reported by Project Arachnid reappeared on the exact same services, even though it had already been taken down for violating the law. Had these services simply checked incoming uploads against a hash list such as the one offered by Project Arachnid, these re-uploads could have been blocked before they ever went live.
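The hash-matching approach described above can be sketched in a few lines. This is a minimal illustration, not Project Arachnid's actual implementation: the blocklist entries and helper function below are hypothetical, and a plain cryptographic hash only catches byte-identical re-uploads, whereas production systems typically also use perceptual hashing (e.g. PhotoDNA) to detect slightly altered copies.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of previously removed files.
# A real deployment would sync this from a service like Project Arachnid.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"previously-removed-file").hexdigest(),
}

def should_block_upload(data: bytes, blocklist: set) -> bool:
    """Return True if the uploaded bytes exactly match a known-bad hash.

    Note: an exact-match hash misses any re-encoded or cropped copy;
    perceptual hashing is needed to catch near-duplicates.
    """
    return hashlib.sha256(data).hexdigest() in blocklist

# A re-upload of removed content is rejected; unrelated content passes.
print(should_block_upload(b"previously-removed-file", KNOWN_BAD_HASHES))  # True
print(should_block_upload(b"new-harmless-file", KNOWN_BAD_HASHES))        # False
```

The key design point is that the check runs at upload time, so previously identified material never becomes publicly reachable again.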
Based on the findings of this study, the Canadian Centre for Child Protection compiled a catalogue of recommendations.
Excerpt:
- Enact and impose a duty of care, along with financial penalties for non-compliance or failure to fulfill a required duty of care.
- Require automated, proactive content detection for platforms with user-generated content.
- Impose certain legal/contractual obligations in the terms of service for electronic service providers and their downstream customers.
- Set standards for content that may not be criminal, but remains severely harmful or abusive to minors.
- Establish standards for user-reporting mechanisms and content removal obligations.