Hello everyone,
We just started using the SafeSearch detection feature of the Google Cloud Vision API to moderate image uploads on our platform.
At first, we moderated images flagged as "adult", "medical", and "racy".
Then we realized "racy" is a little too strict: it easily rejects images showing fully clothed cleavage (possibly because the image was a drawing, I'm not sure). So we started allowing "racy".
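For context, here is roughly how we call SafeSearch today (a minimal sketch using the Python client; the helper name and file path are placeholders for illustration, not our production code):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

def safe_search_verdict(image_bytes: bytes) -> vision.SafeSearchAnnotation:
    """Run SafeSearch detection on one upload and return the annotation."""
    image = vision.Image(content=image_bytes)
    response = client.safe_search_detection(image=image)
    return response.safe_search_annotation

# Placeholder path; each field on the annotation is a Likelihood enum value
# (UNKNOWN, VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY).
with open("upload.jpg", "rb") as f:
    verdict = safe_search_verdict(f.read())
print(verdict.adult, verdict.medical, verdict.racy)
```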
I want to know what category "child porn" and "inappropriately dressed children" images are flagged as. We had one instance where a user uploaded child porn images that could have been interpreted as a scantily clad pre-teen or baby. Will images like that be flagged as "adult" or "racy"?
I want to make sure that, by denying images moderated as "adult" (VERY_LIKELY), we will be able to detect and filter out images that are (see the sketch after this list for the exact check I mean):
1. Child porn that includes sex
2. Child porn that includes no sex acts, only inappropriately dressed children
3. Nude photos of children in any pose (even if it's something like a baby being powdered or diapered)
4. Clothed children in inappropriate poses
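To be concrete, this is the kind of reject rule I'm describing (a sketch only, reusing the hypothetical helper above; it treats adult == VERY_LIKELY as the sole hard block, which is exactly what I'm asking whether we can rely on):

```python
def should_reject(verdict: vision.SafeSearchAnnotation) -> bool:
    # Rule under discussion: hard-reject only when "adult" comes back
    # VERY_LIKELY; "racy" is no longer blocked on our platform.
    return verdict.adult == vision.Likelihood.VERY_LIKELY
```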
Furthermore, I'm curious whether there's an option to detect "offensive" images such as violent vomiting or literal toilet/diarrhea photos, etc. Which of the existing labels would detect those kinds of images (if any)? If such images are not detectable with the Cloud Vision SafeSearch API, what are our best options?
Thank you for your help, as this is very important to us.
I look forward to your reply.