Hi Bubblers!
With this plugin, you can detect and filter explicit, suggestive, adult, pornographic, NSFW, or violent content within an MPEG-4 or MOV video encoded with the H.264 codec and stored in AWS S3. The plugin also provides actions to Put, Get, and Delete a file from AWS S3.
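For reference, here is a minimal Python (boto3) sketch of what the Put, Get, and Delete actions correspond to on the AWS side; the bucket and file names are placeholders, and inside Bubble the plugin exposes these operations as workflow actions, so no code is needed on your end.

```python
# A minimal sketch of the S3 operations behind the plugin's Put, Get, and Delete actions.
# "my-bucket" and the object keys are placeholders, not values used by the plugin.
import boto3

s3 = boto3.client("s3")

# Put: upload a local video file to S3
s3.upload_file("local_video.mp4", "my-bucket", "videos/my_video.mp4")

# Get: download the object back to disk
s3.download_file("my-bucket", "videos/my_video.mp4", "downloaded_video.mp4")

# Delete: remove the object from the bucket
s3.delete_object(Bucket="my-bucket", Key="videos/my_video.mp4")
```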
You can use this plugin in a variety of use cases such as social media, online marketplaces, and professional media. By using Amazon Rekognition to detect unsafe content, you can reduce the need for human content review.
Amazon Rekognition uses a two-level hierarchical taxonomy to label categories of unsafe content. Each top-level category has a number of second-level categories; more information can be found here: https://docs.aws.amazon.com/rekognition/latest/dg/moderation.html.
The plugin returns a list of unsafe Moderation Labels. For each label, it returns the name, parent category, confidence level, and the timestamp at which it was detected in the video.
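To illustrate the data you get back, here is a minimal sketch of the underlying Rekognition Video flow in Python (boto3), assuming an H.264 video already stored in S3 (the bucket and key names are placeholders). The plugin runs this flow for you; the sketch only shows where the name, parent category, confidence, and timestamp come from.

```python
# A minimal sketch, assuming boto3 and a video already stored in S3.
# Bucket/key names are placeholders; the plugin handles this flow inside Bubble.
import time
import boto3

rekognition = boto3.client("rekognition")

# Start an asynchronous content-moderation job on the S3 video
job = rekognition.start_content_moderation(
    Video={"S3Object": {"Bucket": "my-bucket", "Name": "videos/my_video.mp4"}},
    MinConfidence=50,  # only return labels at or above this confidence
)

# Poll until the job finishes
while True:
    result = rekognition.get_content_moderation(JobId=job["JobId"])
    if result["JobStatus"] != "IN_PROGRESS":
        break
    time.sleep(5)

# Each detected label carries a name, parent category, confidence, and timestamp
for item in result.get("ModerationLabels", []):
    label = item["ModerationLabel"]
    print(
        label["Name"],        # e.g. "Suggestive"
        label["ParentName"],  # parent (top-level) category, empty for top-level labels
        label["Confidence"],  # confidence score, 0-100
        item["Timestamp"],    # position in the video, in milliseconds
    )
```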
You can test out our AWS Rekognition Video - NSFW Content plugin with the live demo here.
Enjoy!
Made with ❤️ by wise:able
Discover our other Artificial Intelligence-based Plugins