More stories

Hive to be Lead Sponsor of Trust & Safety Summit 2025
We are thrilled to announce that Hive is the lead sponsor of the Trust & Safety Summit 2025, Europe’s premier conference for T&S leaders.

Protecting Children’s Online Safety with Internet Watch Foundation
Hive is proud to announce that we are partnering with Internet Watch Foundation (IWF), a non-profit organization working to stop child sexual abuse online.

Forbes
This $2 Billion Content Moderation Company Is Trying To Stop AI Images Of Child Sexual Abuse

State of the Deepfake: Trends & Threat Forecast for 2025
We at Hive are excited to share our new report on the state of the deepfake, covering its evolving trends and emerging threats heading into 2025.

Expanding our Moderation APIs with Hive’s New Vision Language Model
Hive has released our Moderation 11B Vision Language Model, which offers a powerful way to handle flexible, context-dependent moderation scenarios.

Announcing Hive’s Partnership with the Defense Innovation Unit
We are proud to announce that Hive has been awarded a landmark Department of Defense (DoD) contract for deepfake content detection.