Explicit user content

Is there a way to automatically (and quickly) check texts and images for explicit content?

And will I be held responsible if I don’t notice it and delete it in time? If the app scales, it’s almost impossible to check everything that users create.

It’s important to have good terms of use in place that make clear what type of content you consider explicit, and to have users agree to them beforehand.

Then make sure users verify their email address before they can use the website. It’s too annoying to burn through lots of email addresses just to be a douchebag on a new platform.
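If you ever handle this outside Bubble’s built-in flows, the core of email verification is just a single-use, expiring token. Here’s a minimal Python sketch; the `users` and `email_tokens` tables and the SQLite storage are assumptions, not a prescribed setup:

```python
import secrets
import sqlite3
from datetime import datetime, timedelta

db = sqlite3.connect("app.db")  # assumed storage; adapt to your stack

def issue_verification_token(user_id: int) -> str:
    """Create a single-use token that expires in 24 hours and store it."""
    token = secrets.token_urlsafe(32)
    expires = (datetime.utcnow() + timedelta(hours=24)).isoformat()
    db.execute(
        "INSERT INTO email_tokens (user_id, token, expires_at) VALUES (?, ?, ?)",
        (user_id, token, expires),
    )
    db.commit()
    return token  # put this in the verification link you email out

def verify_token(token: str) -> bool:
    """Mark the user as verified if the token exists and has not expired."""
    row = db.execute(
        "SELECT user_id, expires_at FROM email_tokens WHERE token = ?", (token,)
    ).fetchone()
    if row is None or datetime.fromisoformat(row[1]) < datetime.utcnow():
        return False
    db.execute("UPDATE users SET email_verified = 1 WHERE id = ?", (row[0],))
    db.execute("DELETE FROM email_tokens WHERE token = ?", (token,))
    db.commit()
    return True
```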

Have a way for people to report abusive content, so you get an email notification with a link to the content and can review it. If your app is scaling (which is a great thing!), you’ll have more resources to invest in content moderation, but don’t worry about that too much right now. Just make it easy for people to report: a small icon in the bottom left or right of the content area that shows a group with a workflow to report the content.
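Under the hood, that report button doesn’t need to be fancy. Here’s a rough Python sketch of the notification side, assuming an SMTP relay and placeholder addresses/URLs (all example values, not a specific service):

```python
import smtplib
from email.message import EmailMessage

MODERATOR_EMAIL = "moderator@example.com"  # assumption: where reports should land
SMTP_HOST = "localhost"                    # assumption: your mail relay

def report_content(content_id: str, reporter_id: str, reason: str) -> None:
    """Email the moderation inbox a direct link to the flagged item."""
    msg = EmailMessage()
    msg["Subject"] = f"Content report: {content_id}"
    msg["From"] = "no-reply@example.com"
    msg["To"] = MODERATOR_EMAIL
    msg.set_content(
        f"User {reporter_id} reported content {content_id}.\n"
        f"Reason: {reason}\n"
        f"Review it here: https://example.com/admin/content/{content_id}"
    )
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)
```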

That’s why we made these for you :love_letter::

Detect negative sentiment from text

Detect explicit content in images

Detect explicit content in videos
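For context, checks like these usually call a cloud moderation API behind the scenes. This is not necessarily how the plugins above work internally, but as a rough illustration, an explicit-image check can be done directly against Google Cloud Vision’s SafeSearch feature (requires the google-cloud-vision package and credentials):

```python
from google.cloud import vision

def image_is_explicit(image_url: str) -> bool:
    """Return True if SafeSearch rates the image as likely adult or violent."""
    client = vision.ImageAnnotatorClient()
    image = vision.Image(source=vision.ImageSource(image_uri=image_url))
    annotation = client.safe_search_detection(image=image).safe_search_annotation
    flagged = (vision.Likelihood.LIKELY, vision.Likelihood.VERY_LIKELY)
    return annotation.adult in flagged or annotation.violence in flagged
```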


wow, super cool! thanks for sharing @redvivi :metal:


Thanks for sharing!
But it takes quite some time to check an image (as in their example). That’s going to be tedious for users to wait through.


Once the check has run once in Bubble’s backend from your app, subsequent calls for different pictures are quick (around 1-2 seconds) thanks to Bubble’s intelligent caching.
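If you were wiring the API calls up yourself rather than through Bubble, the same idea can be approximated with a simple app-level cache keyed by the picture URL, so each image is only sent to the provider once. This is just an analogue, not Bubble’s actual mechanism:

```python
import functools

def check_with_provider(image_url: str) -> bool:
    """Placeholder for the real moderation call (e.g. the SafeSearch sketch above)."""
    raise NotImplementedError

@functools.lru_cache(maxsize=4096)
def moderation_verdict(image_url: str) -> bool:
    """Each distinct URL hits the provider once; repeat checks come from the cache."""
    return check_with_provider(image_url)
```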

Alternatively, you could also moderate the pictures at a later stage, asynchronously from the user upload.
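A minimal sketch of that deferred approach, assuming an `uploads` table with `id`, `image_url` and `moderation_status` columns (all assumed names): new uploads start as "pending" and a scheduled job works through them later, so the user never waits on the check.

```python
import sqlite3

def moderate_pending_uploads(db_path: str, is_explicit) -> None:
    """Run on a schedule (e.g. a recurring backend workflow or cron job)."""
    db = sqlite3.connect(db_path)
    rows = db.execute(
        "SELECT id, image_url FROM uploads WHERE moderation_status = 'pending'"
    ).fetchall()
    for upload_id, image_url in rows:
        # is_explicit is whatever moderation check you use (see the sketches above)
        verdict = "rejected" if is_explicit(image_url) else "approved"
        db.execute(
            "UPDATE uploads SET moderation_status = ? WHERE id = ?",
            (verdict, upload_id),
        )
    db.commit()
    db.close()
```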
