🍑 ᴺᴱᵂ ᴾᴸᵁᴳᴵᴺ AWS Rekognition - Detect Unsafe & NSFW Content (incl. Automated AWS Environment Setup!)

Hi Bubblers!

With this plugin, you can detect and filter explicit, suggestive, adult, violent, or otherwise NSFW content in a JPEG or PNG image file provided as input.

You can use this plugin in a variety of use cases such as social media, online marketplaces, and professional media. By using Amazon Rekognition to detect unsafe content, you can reduce the need for human review.

The plugin returns a list of unsafe labels (name, parent category, and confidence level) detected in an image, ranked from the most to the least probable.
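Under the hood, this kind of result matches the shape of the response returned by Rekognition's DetectModerationLabels API. Here is a minimal sketch of how such a response can be ranked by confidence; the sample label names and confidence values are made up for illustration, and the actual API call (via boto3, for example) would require AWS credentials:

```python
# A response in the shape returned by Rekognition's DetectModerationLabels API.
# The label names and confidence values below are illustrative samples only.
sample_response = {
    "ModerationLabels": [
        {"Name": "Graphic Violence", "ParentName": "Violence", "Confidence": 72.1},
        {"Name": "Suggestive", "ParentName": "", "Confidence": 91.8},
        {"Name": "Female Swimwear Or Underwear", "ParentName": "Suggestive", "Confidence": 88.4},
    ]
}

def rank_unsafe_labels(response):
    """Return (name, parent category, confidence) tuples, most probable first."""
    labels = response.get("ModerationLabels", [])
    return sorted(
        ((label["Name"], label["ParentName"], label["Confidence"]) for label in labels),
        key=lambda item: item[2],
        reverse=True,
    )

ranked = rank_unsafe_labels(sample_response)
# Most probable label first: ("Suggestive", "", 91.8)
```

Top-level categories (like "Suggestive" above) have an empty parent, while second-level labels reference their parent category, which is what lets you filter at whichever granularity your app needs.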

You can test out our AWS Rekognition - Unsafe Content Plugin with the live demo.

Enjoy!
Made with :black_heart: by wise:able
Discover our other Artificial Intelligence-based Plugins


Hello Bubblers!

Just to let you know that this plugin has been updated to include an automated script that configures your AWS environment for you :man_mechanic:t3:.

Enjoy!