Image Moderation

Introduction

Image Moderation is a Catalyst Zia Services component that monitors and recognizes inappropriate and unsafe content in images. It is a sub-feature of content moderation that determines whether a particular image is safe for work, based on a pre-determined set of rules. Zia Image Moderation monitors for and recognizes the following criteria in an image:

  • Explicit nudity
  • Racy or suggestive content
  • Bloodshed, gore, and violence
  • Drugs and substances
  • Weapons

Image Moderation helps ensure that the user-generated content in your Catalyst applications does not violate your application's standards and guidelines. You can maintain your brand reputation and general decorum by flagging, filtering, or automatically deleting detected inappropriate content, as sketched in the example below.
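
As an illustration, the following Java sketch shows one way an application might map detected categories to a moderation action such as flagging or deleting content. It is not part of the Catalyst SDK; the category labels and the decide method are assumptions for illustration only, and the actual labels returned by Zia Image Moderation may differ.

// Illustrative only: a hypothetical policy that maps detected categories to an
// action. The category labels below mirror the criteria listed above; the actual
// labels returned by Zia Image Moderation may differ.
import java.util.Set;

public class ModerationPolicy {

    public enum Action { ALLOW, FLAG_FOR_REVIEW, DELETE }

    // Categories that should cause immediate removal (assumed severity levels).
    private static final Set<String> DELETE_CATEGORIES =
            Set.of("explicit_nudity", "violence", "weapons", "drugs");

    // Categories that are only flagged for a human reviewer.
    private static final Set<String> FLAG_CATEGORIES = Set.of("racy");

    public static Action decide(Set<String> detectedCategories) {
        if (detectedCategories.stream().anyMatch(DELETE_CATEGORIES::contains)) {
            return Action.DELETE;
        }
        if (detectedCategories.stream().anyMatch(FLAG_CATEGORIES::contains)) {
            return Action.FLAG_FOR_REVIEW;
        }
        return Action.ALLOW;
    }

    public static void main(String[] args) {
        // Example: an image detected as racy but not explicit is flagged, not deleted.
        System.out.println(decide(Set.of("racy")));             // FLAG_FOR_REVIEW
        System.out.println(decide(Set.of("explicit_nudity")));  // DELETE
        System.out.println(decide(Set.of()));                   // ALLOW
    }
}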

Catalyst provides Image Moderation in the Java, Node.js, and Python SDK packages, which you can integrate into your Catalyst web or Android application. The Catalyst console provides easy access to code templates for these environments that you can implement in your application's code.

You can also test Image Moderation using sample images in the console and obtain moderation results based on the criteria listed above. Image Moderation also returns a confidence score for each result, which enables you to verify its accuracy and make informed decisions about the next steps to take.
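
For instance, the sketch below (again hypothetical, not the SDK API) shows how an application might use per-category confidence scores: only detections at or above a chosen threshold are acted on automatically, and lower-confidence detections are routed to manual review. The score range, field names, and thresholds are assumptions; consult the SDK and API documentation for the actual response format.

// Illustrative only: thresholding hypothetical confidence scores (0-100) returned
// per category. The score range and field names are assumptions, not the actual
// Catalyst response format.
import java.util.Map;

public class ConfidenceFilter {

    private static final double AUTO_ACTION_THRESHOLD = 90.0;  // act automatically
    private static final double REVIEW_THRESHOLD = 60.0;       // send to a human

    public static String evaluate(Map<String, Double> categoryConfidences) {
        double maxConfidence = categoryConfidences.values().stream()
                .mapToDouble(Double::doubleValue)
                .max()
                .orElse(0.0);

        if (maxConfidence >= AUTO_ACTION_THRESHOLD) {
            return "auto-moderate";      // high confidence: act without review
        } else if (maxConfidence >= REVIEW_THRESHOLD) {
            return "manual-review";      // uncertain: let a moderator decide
        }
        return "allow";                  // no reliable detection
    }

    public static void main(String[] args) {
        System.out.println(evaluate(Map.of("racy", 95.2)));            // auto-moderate
        System.out.println(evaluate(Map.of("weapons", 72.4)));         // manual-review
        System.out.println(evaluate(Map.of("explicit_nudity", 12.0))); // allow
    }
}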

You can refer to the Java SDK documentation, Node.js SDK documentation, and Python SDK documentation for code samples of Image Moderation. Refer to the API documentation to learn about the API available for Image Moderation.

You can learn more about the other components of Catalyst Zia Services from this page.

Last Updated 2023-08-18 18:27:19 +0530