
Streamline CSAM Reports with Moderation Dashboard’s NCMEC Integration


Hive is excited to announce that we have integrated the National Center for Missing & Exploited Children’s (NCMEC) CyberTipline into Moderation Dashboard, streamlining the process of submitting child sexual abuse material (CSAM) reports. This feature is now available to all Moderation Dashboard customers with valid NCMEC credentials.

Ensuring Child Safety Online

The National Center for Missing & Exploited Children is a non-profit organization dedicated to protecting children from all forms of exploitation and abuse. All electronic communication service providers are required under U.S. federal law to report any known CSAM on their platforms to NCMEC’s CyberTipline—a centralized system for receiving and processing CSAM reports. These reports are later shared with law enforcement and relevant service providers so they can take further action.

Hive’s commitment to online safety has been unwavering throughout our endeavors and partnerships. We built this integration to automate the reporting process, simplify our customers’ workflows, and help ensure that their platforms comply with applicable law.

Integration Workflow

Below is a step-by-step sample integration workflow, starting from when a user uploads an image to the platform and ending with the actions a moderator can take. For a more detailed guide to how the reporting process works, refer to our documentation.

  1. A user uploads an image to the platform.
  2. The image is processed by Hive’s proprietary CSAM Detection API, powered by Thorn—a leading nonprofit that builds technology to defend children from sexual abuse. To learn more about our Thorn partnership, see our earlier blog posts.
  3. If the image is flagged as likely CSAM, it surfaces as a link in the CSAM Review Feed. Clicking the link opens the media in a new browser tab for the moderator to review; Moderation Dashboard never displays CSAM content directly within the Review Feed.
  4. From the Review Feed, the moderator can take two actions:
    • Perform an enforcement action (e.g., banning the user or deleting the post). A webhook containing the chosen enforcement action, along with the post and user metadata, is then sent to the customer’s server, where it is used to take the content down.
    • Submit a report to NCMEC. The system automatically creates the report, and the moderator sends it by clicking the “Submit” button within the Review Feed. After submission, the system creates an internal log to track the report (e.g., the submission date and time, along with the response from NCMEC).
“Report to NCMEC” button within Review Feed
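On the customer’s side, the enforcement webhook from step 4 can be handled with a small dispatcher. The payload shape below (`action`, `post_id`, and `user_id` fields) is purely hypothetical and for illustration only; consult the Moderation Dashboard documentation for the actual webhook schema.

```python
import json

def handle_enforcement_webhook(raw_body: str) -> str:
    """Parse a hypothetical enforcement webhook and apply the takedown.

    Assumed payload shape (illustrative, not Hive's actual schema):
    {"action": "delete_post" | "ban_user", "post_id": "...", "user_id": "..."}
    """
    payload = json.loads(raw_body)
    action = payload.get("action")
    if action == "delete_post":
        # Remove the flagged post from the platform's own datastore.
        return f"deleted post {payload['post_id']}"
    if action == "ban_user":
        # Disable the offending account.
        return f"banned user {payload['user_id']}"
    raise ValueError(f"unknown enforcement action: {action!r}")
```

In a real deployment this function would sit behind the HTTPS endpoint registered to receive Moderation Dashboard webhooks, with signature verification before any action is taken.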

NCMEC Report Contents

Customers can pre-fill information fields that are constant across reports. These fields will be automatically populated for each report, reducing effort on the customer’s end. To provide our customers with full transparency, the report sent to NCMEC includes: the moderator’s information, the company’s information, the potential CSAM content, and the incident date and time.
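Conceptually, the pre-filled fields act as a constant template that is merged with per-incident details at submission time. The field names below are illustrative only, not NCMEC’s actual report schema.

```python
# Constant fields a customer configures once (names illustrative).
REPORT_TEMPLATE = {
    "company_name": "ExampleCo",
    "moderator_name": "Jane Doe",
    "moderator_email": "jane@example.com",
}

def build_report(incident: dict) -> dict:
    """Merge the constant template with per-incident details.

    `incident` carries the fields that change per report, such as the
    flagged content's URL and the incident date and time.
    """
    report = dict(REPORT_TEMPLATE)   # copy so the template stays unchanged
    report.update(incident)          # per-incident fields take precedence
    return report
```

Keeping the constant fields in one template is what reduces per-report effort: each submission only needs the incident-specific values.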

Moderator information fields

If you’re interested in learning more about what we do, please reach out to our sales team (sales@thehive.ai) or contact us here for further questions.


Super Bowl LIX – As Seen By AI

Next-day insights on the latest trends in marketing and culture, powered by Hive’s AI models. For more detailed analytics, download the full report below.

Key Insights

Were They Here Last Year?

Brands not active during last year’s Super Bowl made up 51% of the airtime for nationally-televised ads during this year’s Big Game.

Meet My Famous Friends

Celebrity integration has become a cornerstone of many brands’ creative decisions for Super Bowl commercials. This year was no different: 60% of ads featured at least one celebrity, up from 50% last year. Actors and actresses remain the most common type of celebrity cast in Super Bowl commercials.

EVs Unplugged

This year’s Super Bowl featured both the lowest count (one) and the lowest share (50%) of auto ads referencing electric vehicles since at least 2020.

A Part Of The Game

27 brands earned more than 5 seconds of screen time within the game and postgame telecast (excluding pregame and commercials), totaling almost two hours of cumulative screen time worth $247.8M in equivalent media value.


Hive to be Lead Sponsor of Trust & Safety Summit 2025

We are thrilled to announce that Hive is the lead sponsor of the Trust & Safety Summit 2025.

As Europe’s premier Trust & Safety conference, the summit is designed to empower T&S leaders to tackle operational and regulatory challenges, providing them with both actionable insights and future-focused strategies. It will be held Tuesday, March 25th and Wednesday, March 26th at the Hilton London Syon Park, UK.

The two-day event will explore themes such as regulatory preparedness, scaling trust and safety solutions, and best practices for effective content moderation. Programming includes expert-led panels, interactive workshops, and networking events.

Hive’s CEO Kevin Guo will deliver the keynote presentation, “The Next Frontier of Content Moderation,” covering topics such as multimodal LLMs and detecting AI-generated content. Additionally, Hive will host two panels during the event:

  • Hyperscaling Trust & Safety: Navigating Growth While Maintaining Integrity. Hive will be discussing best practices for scaling trust & safety systems for online platforms experiencing hypergrowth.
  • Harnessing AI to Detect Unknown CSAM: Innovations, Challenges, and the Path Forward. Hive will be joined by partners Thorn and IWF to discuss recent advancements in CSAM detection solutions.

As the lead sponsor of the T&S Summit 2025, we are furthering our commitment to making the internet a safer place. Today, Hive’s comprehensive moderation stack empowers Trust & Safety teams of all sizes to scale their moderation workflows with both pre-trained and customizable AI models, flexible LLM-based moderation, and a moderation dashboard for streamlined enforcement of policies. 

We look forward to welcoming you to the Trust & Safety Summit 2025. If you’re interested in attending the conference, please reach out to your Hive account manager or sales@thehive.ai. Prospective attendees can also find more details and ticket information here. For a detailed breakdown of summit programming, download the agenda here.

To learn more about what we do at Hive, please reach out to our sales team or contact us here for further questions.