Analyze text and images for harmful content using Azure AI Content Safety (@azure-rest/ai-content-safety). Use when moderating user-generated content, detecting hate speech, violence, sexual content, or self-harm, or managing custom blocklists.
Add this skill
Comprehensive Azure content moderation guide with excellent examples and helper functions.
npx mdskills install sickn33/azure-ai-contentsafety-ts