Hi Bubblers!
With this plugin, you can detect and filter explicit, suggestive, adult, pornographic, NSFW, or violent content in a JPEG or PNG image provided as input.
You can use this plugin in a variety of use cases such as social media, online marketplaces, and professional media. By using Amazon Rekognition to detect unsafe content, you can reduce the need for human review of that content.
The plugin returns a list of unsafe labels (name, parent category, and confidence level) detected in an image, ranked from the most to the least probable.
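For those curious about what happens behind the scenes: this kind of moderation is typically done with Rekognition's DetectModerationLabels API, which returns labels shaped like the list above. The sketch below is an illustration under that assumption (the boto3 call is shown in a comment since it needs AWS credentials); the `rank_moderation_labels` helper and the sample values are ours, not part of the plugin.

```python
# Assumed underlying call (requires AWS credentials, shown for context only):
#   import boto3
#   rekognition = boto3.client("rekognition")
#   resp = rekognition.detect_moderation_labels(
#       Image={"Bytes": image_bytes}, MinConfidence=50)
#   labels = resp["ModerationLabels"]

def rank_moderation_labels(labels):
    """Return (name, parent category, confidence) tuples,
    ranked from most to least probable."""
    return sorted(
        ((l["Name"], l.get("ParentName", ""), l["Confidence"]) for l in labels),
        key=lambda t: t[2],
        reverse=True,
    )

# Sample shaped like Rekognition's ModerationLabels field (illustrative values).
sample = [
    {"Name": "Graphic Violence", "ParentName": "Violence", "Confidence": 62.1},
    {"Name": "Suggestive", "ParentName": "", "Confidence": 91.7},
    {"Name": "Violence", "ParentName": "", "Confidence": 70.4},
]

ranked = rank_moderation_labels(sample)
print(ranked[0][0])  # most probable label comes first
```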
You can test out our AWS Rekognition - Unsafe Content Plugin with the live demo.
Enjoy!
Made with ❤️ by wise:able
Discover our other Artificial Intelligence-based Plugins