Image Moderation
Image Moderation detects and recognizes inappropriate and unsafe content in images. The criteria include suggestive or explicit racy content, nudity, violence, gore, bloodshed, and the presence of weapons and drugs.
You can provide a .jpg/.jpeg or .png file as the input. Refer to the API documentation for the request and response formats.
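Because only these formats are accepted, you may want to validate the file before creating the stream. The following is a minimal sketch; the isSupportedImage helper is illustrative and not part of the Catalyst SDK.

const path = require('path');

// Illustrative helper (not part of the SDK): accept only the formats
// the Image Moderation API supports.
function isSupportedImage(filePath) {
    const ext = path.extname(filePath).toLowerCase();
    return ['.jpg', '.jpeg', '.png'].includes(ext);
}

if (!isSupportedImage('./weapon.png')) {
    throw new Error('Unsupported format; provide a .jpg, .jpeg, or .png file');
}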
You can optionally set the moderation mode to BASIC, MODERATE, or ADVANCED. The image is processed in ADVANCED mode by default.
The response returns the probability of each applicable criterion along with a confidence score, and a prediction of whether the image is safe_to_use or unsafe_to_use.
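For example, to process an image in the BASIC mode instead of the default, pass the mode in the options object. This is a sketch modeled on the sample further below; the lowercase value mirrors that sample, and the exact accepted casing is an assumption.

let fs = require('fs');

// Sketch: explicitly selecting the BASIC mode (zia is obtained as
// described below; lowercase mode value mirrors the sample that follows).
zia.moderateImage(fs.createReadStream('./weapon.png'), { mode: 'basic' })
    .then((result) => console.log(result))
    .catch((err) => console.log(err.toString()));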
The zia reference used below is defined in the component instance page. The promise returned here resolves to a JSON object.
let fs = require('fs');

zia.moderateImage(fs.createReadStream('./weapon.png'), { mode: 'moderate' }) // Pass the input file and the mode
    .then((result) => {
        console.log(result);
    })
    .catch((err) => console.log(err.toString())); // Push errors to Catalyst Logs
A sample response is shown below. The response is the same for both versions of Node.js.
copy{"probability":{"racy":"0.09","nudity":"0.06"},"confidence":"0.85","prediction":"safe_to#_use"}