A plugin that quickly identifies whether a given image or GIF is sensitive content or NSFW (not safe for work).
The plugin reports image probabilities across the following 5 classes:
Drawing - safe for work drawings (including anime)
Hentai - hentai and pornographic drawings
Neutral - safe for work neutral images
Porn - pornographic images, sexual acts
Sexy - sexually explicit images, not pornography
Each category receives a percentage from 0% to 100% indicating the likelihood that the image belongs to that category.
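The five class names and per-class probabilities match the output format of the open-source nsfwjs library, which this plugin appears to be built on. A minimal sketch of what a classification result looks like in client-side TypeScript (the direct nsfwjs dependency and the image element ID are assumptions for illustration, not part of the plugin itself):

```typescript
import * as nsfwjs from "nsfwjs";

// Load the model once and reuse it; loading is the expensive step.
const model = await nsfwjs.load();

// Classify an <img> element already present on the page.
const img = document.getElementById("uploaded-image") as HTMLImageElement;
const predictions = await model.classify(img);

// predictions is an array of { className, probability }, one entry per class,
// e.g. [{ className: "Neutral", probability: 0.91 },
//       { className: "Drawing", probability: 0.06 }, ...]
for (const p of predictions) {
  console.log(`${p.className}: ${(p.probability * 100).toFixed(1)}%`);
}
```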
Check user-uploaded images/GIFs before using them.
Instructions:
Keep in mind that NSFW detection isn't perfect, but it's pretty accurate (~90% on a test set of 15,000 images).
It is OK for some images to get a few percent in irrelevant categories.
Best practice is to combine results. For example, if the Hentai, Porn, and Sexy categories each score below 20%, the image is safe to use even if it scores, say, 60% in the Drawing category and is not actually a drawing (see the sketch after these notes).
Don't run more than one test at a time; otherwise the processor will be overloaded and the page may freeze or other unexpected errors can appear.
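A minimal sketch of the threshold rule above combined with classifying a batch of images strictly one at a time. The 20% threshold, the nsfwjs dependency, and the helper names are illustrative assumptions, not part of the plugin:

```typescript
import * as nsfwjs from "nsfwjs";

type Prediction = { className: string; probability: number };

// Illustrative rule: treat the image as safe when every sensitive
// class (Hentai, Porn, Sexy) stays below 20%.
const SENSITIVE_CLASSES = ["Hentai", "Porn", "Sexy"];
const THRESHOLD = 0.2;

function isSafe(predictions: Prediction[]): boolean {
  return predictions
    .filter((p) => SENSITIVE_CLASSES.includes(p.className))
    .every((p) => p.probability < THRESHOLD);
}

// Classify images sequentially so only one test runs at a time
// and the page is not overloaded.
async function checkAll(images: HTMLImageElement[]): Promise<boolean[]> {
  const model = await nsfwjs.load();
  const results: boolean[] = [];
  for (const img of images) {
    // `await` inside the loop keeps exactly one classification in flight.
    const predictions = await model.classify(img);
    results.push(isSafe(predictions));
  }
  return results;
}
```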
Can you share the editor link for the NSFW plugin demo? Unfortunately, the NSFW editor link on the Ez Code Plugins (bubbleapps.io) page takes me to the "native-share" plugin.
@eazycode the plugin stopped working. I also checked on your demo site. Can you please look into it? Is anyone else experiencing problems? I am not getting any values back after images are analyzed.
@eazycode yes, I can confirm that it works on Safari now, but it is not working on Chrome (latest version).
Actually, after trying several times in Chrome, it also works sometimes. So the problem is that it is not working reliably, or maybe it is very slow or somehow blocked. Please check; I really appreciate your effort.
For example, I would like to run a workflow that looks through the last 24 hours of uploaded images and fills a report. Is that possible? And is there a limit on the number of images?