# NSFW

## Overview
The nsfw task detects explicit or unsafe content in images and videos.
It analyzes the visual content and returns boolean flags, per-category severity levels, and a human-readable description for categories including nudity, sexual content, violence, and gore.
## Creating an NSFW task
The nsfw task has no configurable options — all analysis is returned by default.
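Since there are nothing to configure, creating the task amounts to naming it. A minimal sketch of a task request body follows; the field names (`kind`, `url`) are illustrative assumptions, not confirmed by this page:

```json
{
  "kind": "nsfw",
  "url": "https://example.com/upload.mp4"
}
```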
## Output
When the task succeeds, the output JSON file contains:
| Field | Type | Description |
|---|---|---|
| `nudity` | boolean | Whether nudity was detected |
| `sexual` | boolean | Whether sexual content was detected |
| `violence` | boolean | Whether violence was detected |
| `gore` | boolean | Whether gore was detected |
| `categories` | string[] | List of detected categories |
| `severity` | object | Severity level per category (`none`, `mild`, `moderate`, `severe`) |
| `description` | string | Human-readable description of the findings |
| `timeline` | array | For video: timestamped detections with `start`, `end`, `category`, and `severity` |
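To show how the fields fit together, here is a small illustrative consumer of the output JSON. The structure follows the table above, but the helper names and the sample payload are hypothetical, not real API output:

```python
def severe_categories(output: dict) -> list[str]:
    """Return categories whose severity is 'moderate' or 'severe'."""
    flagged = {"moderate", "severe"}
    return [cat for cat, level in output.get("severity", {}).items()
            if level in flagged]

def blur_ranges(output: dict) -> list[tuple[float, float]]:
    """For video output, collect (start, end) spans worth blurring."""
    return [(seg["start"], seg["end"])
            for seg in output.get("timeline", [])
            if seg["severity"] in ("moderate", "severe")]

# Illustrative payload shaped like the table above (not real API output).
sample = {
    "nudity": False, "sexual": False, "violence": True, "gore": False,
    "categories": ["violence"],
    "severity": {"nudity": "none", "sexual": "none",
                 "violence": "moderate", "gore": "none"},
    "description": "Moderate violence detected between 12s and 15s.",
    "timeline": [{"start": 12.0, "end": 15.0,
                  "category": "violence", "severity": "moderate"}],
}

print(severe_categories(sample))  # → ['violence']
print(blur_ranges(sample))        # → [(12.0, 15.0)]
```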
## Supported inputs
NSFW tasks work with:

- Image: `.jpg`, `.jpeg`, `.png`, `.webp`, `.avif`
- Video: `.mp4`, `.mov`, `.webm`
## Common use cases
- User-generated content moderation
- Automatic content filtering before publishing
- Flagging or blurring unsafe media
- Age-restricted platform compliance
## Example automation
You can combine NSFW detection with an automation to process all new uploads:
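One way such an automation could react to results is a small policy function that maps the task output to an action. This is a sketch of application-side logic; the thresholds and action names are illustrative policy choices, not part of the API:

```python
def moderation_action(output: dict) -> str:
    """Map an nsfw task result to a moderation action.

    Thresholds here are an illustrative policy, not part of the API.
    """
    levels = output.get("severity", {}).values()
    if "severe" in levels:
        return "block"    # reject the upload outright
    if "moderate" in levels:
        return "review"   # queue for human moderation
    if output.get("categories"):
        return "blur"     # publish with mild content blurred
    return "publish"      # clean result

print(moderation_action({"severity": {"violence": "moderate"}}))  # → review
```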