Image Moderation

Image Moderation detects and recognizes inappropriate and unsafe content in images. The criteria include suggestive or explicit racy content, nudity, violence, gore, bloodshed, and the presence of weapons and drugs.

You can learn more from the Image Moderation help page.

You can provide a .jpg/.jpeg or .png file as the input. Refer to the API documentation for the request and response formats.

You can optionally set the moderation mode to BASIC, MODERATE, or ADVANCED. By default, the image is processed in ADVANCED mode.
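For instance, if you only need the lighter check, you could build the options with BASIC mode instead. The snippet below is a minimal sketch that mirrors the walkthrough further down on this page; ZCAnalyseMode.BASIC is assumed to be available in the enum alongside the ADVANCED constant shown there.

// Minimal sketch: select a lighter moderation mode instead of the default ADVANCED.
// ZCAnalyseMode.BASIC is an assumption based on the modes listed above.
ZCImageModerationOptions basicOptions = ZCImageModerationOptions.getInstance()
        .setAnalyseMode(ZCAnalyseMode.BASIC);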

The response returns the detected criteria with their confidence scores, along with a final prediction of whether the image is safe_to_use or unsafe_to_use.

Ensure the following packages are imported:

    
import com.zc.component.ml.ZCAnalyseMode;
import com.zc.component.ml.ZCImageModerateData;
import com.zc.component.ml.ZCImageModerationConfidence;
import com.zc.component.ml.ZCImageModerationOptions;
import com.zc.component.ml.ZCImageModerationPrediction;
import com.zc.component.ml.ZCML;
import java.io.File;
import java.util.List;
    
File file = new File("{filePath}"); // Specify the file path
ZCImageModerationOptions options = ZCImageModerationOptions.getInstance().setAnalyseMode(ZCAnalyseMode.ADVANCED); // Set the moderation mode
ZCImageModerateData imData = ZCML.getInstance().moderateImage(file, options); // Call moderateImage() with the input file and options
ZCImageModerationPrediction prediction = imData.getPrediction(); // Get the final prediction
Double predictionConfidence = imData.getConfidence(); // Get the confidence score of the final prediction
List<ZCImageModerationConfidence> confidences = imData.getImageModerationConfidenceList(); // Get the confidence score of each criterion predicted
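Continuing from the variables above, a typical next step is to branch on the overall prediction and log the per-criterion scores. The following is a minimal sketch, not confirmed API usage: the constant name SAFE_TO_USE is assumed from the safe_to_use/unsafe_to_use predictions described earlier, and each ZCImageModerationConfidence entry is printed with its default toString() because its getters are not shown on this page. Check the SDK reference for the exact names.

// Minimal sketch of consuming the moderation result, continuing from the code above.
// ZCImageModerationPrediction.SAFE_TO_USE is an assumed constant name.
if (ZCImageModerationPrediction.SAFE_TO_USE.equals(prediction)) {
    System.out.println("Image is safe to use (confidence: " + predictionConfidence + ")");
} else {
    System.out.println("Image flagged as unsafe (confidence: " + predictionConfidence + ")");
}

// Print the confidence reported for each moderation criterion.
// Replace the default toString() output with the documented getters as needed.
for (ZCImageModerationConfidence criterionConfidence : confidences) {
    System.out.println(criterionConfidence);
}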

Last Updated 2023-09-03 01:06:41 +0530

RELATED LINKS

Image-Moderation - API