
CSAM Detection API

Thorn's industry-leading CSAM detection technology is seamlessly integrated into Hive's content moderation suite. By leveraging Thorn's expertise in child protection and Hive's cutting-edge AI, this integrated solution provides an effective approach to identifying and addressing CSAM within digital ecosystems.
Proactively detect known and new CSAM at scale

CSAM is a serious risk

Platforms with user-generated content face significant challenges in preventing the spread of child sexual abuse material (CSAM). Failing to address these risks can seriously impact a platform's stability and viability.

Volume: The sheer volume of uploads makes manual review prohibitive.
Reputation: Reputational damage can be swift and lasting, impacting user retention and revenue.
Compliance: Regulatory scrutiny can lead to costly compliance measures, and allowing CSAM to proliferate can lead to legal consequences.
Protect your platform

Built by experts in child safety technology, Safer is a comprehensive solution that detects both known and new CSAM for every platform with an upload button.

Industry-leading CSAM detection by Thorn

Harness industry-leading CSAM detection developed by Thorn, a trusted leader in the fight against online child sexual abuse and exploitation.

Seamless integration through Hive

Process high volumes easily with Hive's real-time API responses. Model responses are accessible with a single API call.

Industry-leading CSAM detection by Thorn

Using two equally important technologies – hash matching and artificial intelligence (AI) – Safer detects both known and unknown CSAM. Safer provides a layer of protection for your company’s brand, reputation, and user experience.
AI Classification Models

Advanced AI: Detect new and previously unreported image and video CSAM with state-of-the-art machine learning classification models (“classifiers”).
High-quality training data: Thorn’s classifiers were trained on actual CSAM, in part using trusted data from the National Center for Missing & Exploited Children’s (NCMEC) CyberTipline.
Proprietary Hashing & Matching

Trusted datasets: Identify known CSAM using proprietary hashing and matching against 57.3 million known CSAM hash values (a generic hash-matching illustration follows this list).
Video hash matching: Thorn’s proprietary perceptual, scene-sensitive video hashing (SSVH) technique splits videos into scenes and frames to identify CSAM with precision.
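
To make the hash-matching idea concrete, here is a minimal, generic sketch of exact-match (MD5) hashing in Python. It is not Thorn's proprietary system; the hash set and helper names are illustrative assumptions, and perceptual hashing (which tolerates small edits to a file) is a separate technique not shown here.

```python
import hashlib

# Illustrative stand-in for a database of known hash values; Thorn's actual
# service matches against tens of millions of proprietary CSAM hash values.
KNOWN_HASHES = {
    "5d41402abc4b2a76b9719d911017c592",  # hypothetical example digest
}

def md5_of_file(path: str) -> str:
    """Compute the MD5 hex digest of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known(path: str) -> bool:
    """Exact-match check: does this file's MD5 appear in the known set?"""
    return md5_of_file(path) in KNOWN_HASHES
```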
Seamless integration through Hive

Speed at scale

We handle high volumes with ease and efficiency, serving responses to billions of API calls per month.

Simple integration

Hash matches and model responses are accessible with a single API call. Integrate Thorn's Safer into any application with just a few lines of code.
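
As a rough sketch of what a single API call can look like in practice, here is an example using Python's requests library. The endpoint URL, header, and payload field names are placeholders, not Hive's documented API; consult Hive's API reference for the actual parameters.

```python
import requests

# Placeholder endpoint and credential for illustration only -- see Hive's
# API documentation for the real URL, headers, and request format.
API_URL = "https://api.example.com/csam-detection"
API_KEY = "YOUR_API_TOKEN"

def submit_media(path: str) -> dict:
    """Upload one image or video for CSAM detection and return the JSON response."""
    with open(path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Token {API_KEY}"},
            files={"media": f},  # hypothetical field name
            timeout=60,
        )
    response.raise_for_status()
    return response.json()

result = submit_media("upload.jpg")
print(result)
```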

Proactive updates

Thorn maintains a database of 57M+ known CSAM hashes and receives daily NCMEC updates.

How you can use the CSAM Detection API today

Image

For an image, simply send the image to us, and we will hash it using the MD5 and Safer hashing algorithms. Once the image is hashed, we return the hash-match results in our output JSON, along with the classifier score in the same response.

Video

For videos, we use MD5 hashes and Safer's proprietary perceptual hashing. The MD5 hash returns exact-match videos and only indicates whether the whole video is a known CSAM video. Additionally, Thorn's proprietary perceptual, scene-sensitive video hashing (SSVH) technique splits videos into scenes and frames to identify CSAM with precision. We also return the classifier score in the same response. Note: for the Safer SSVH, videos are sampled at 1 FPS.
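
To illustrate the 1 FPS sampling mentioned in the note above, here is a conceptual sketch using OpenCV. This is not Thorn's SSVH algorithm; it only shows how a video might be decimated to roughly one frame per second before each frame is hashed or classified.

```python
import cv2  # pip install opencv-python

def sample_frames_at_1fps(video_path: str):
    """Yield roughly one frame per second of the video as NumPy arrays."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if FPS is unreported
    step = max(int(round(fps)), 1)
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % step == 0:
            yield frame  # each sampled frame could then be hashed or classified
        index += 1
    capture.release()

frames = list(sample_frames_at_1fps("clip.mp4"))
print(f"Sampled {len(frames)} frames")
```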

Note: All submitted content is deleted immediately, ensuring nothing stays on Hive’s servers.

Model Response

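The exact response schema is defined in Hive's API documentation (the response screenshot is omitted here). Purely as a hypothetical sketch of the kinds of fields such a response combines, hash-match results plus a classifier score, and how an integration might act on them:

```python
# Hypothetical response shape -- field names and threshold are assumptions,
# not Hive's documented schema.
example_response = {
    "md5_match": False,         # exact-hash match against known CSAM
    "safer_hash_match": False,  # perceptual (Safer) hash match result
    "classifier_score": 0.02,   # model-estimated likelihood of CSAM
}

if example_response["md5_match"] or example_response["safer_hash_match"]:
    action = "report"   # known CSAM: escalate per policy and legal obligations
elif example_response["classifier_score"] > 0.9:  # threshold is an assumption
    action = "review"   # potentially new CSAM: route to trained reviewers
else:
    action = "allow"

print(action)
```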

Ready to build something?
