Hive has integrated NCMEC’s CyberTipline into Moderation Dashboard, streamlining the process of submitting child sexual abuse material (CSAM) reports.
Hive provides an in-depth comparison of all of our products against top competitors.
Hive is proud to announce that we’ve achieved both ISO 27001:2022 and SOC 2 Type 2 certifications.
How Easy Is It to Fool A.I.-Detection Tools?
Hive presented at CVPR last week, where we highlighted a few important considerations when building ML models for classification tasks.
Hive announces a new AutoML tool to train, evaluate, and deploy customized machine learning models.
AI-Created Images Are So Good Even AI Has Trouble Spotting Some