NudeNet: An ensemble of Neural Nets for Nudity Detection and Censoring
A demo of the classifier is available at http://bpraneeth.com/projects/nudenet
The code used to create the demo is available at https://github.com/bedapudi6788/NudeNet/issues/16#issuecomment-522936659
An uncensored version of the sample image can be found at https://i.imgur.com/rga6845.jpg (NSFW)
Classification scores on this data are available at https://dataturks.com/projects/Mohan/NSFW(Nudity%20Detection)%20Image%20Moderation%20Datatset
Classification Classes
unsafe -> image contains nudity
safe -> image doesn't contain nudity
Detection Classes
BELLY -> exposed belly (both male and female)
BUTTOCKS -> exposed buttocks (both male and female)
F_BREAST -> exposed female breast
F_GENITALIA -> exposed female genitalia
M_GENITALIA -> exposed male genitalia
M_BREAST -> exposed male breast
Installation
pip install nudenet
# or
pip install git+https://github.com/bedapudi6788/NudeNet
Classifier Usage
from nudenet import NudeClassifier
classifier = NudeClassifier()
classifier.classify('path_to_nude_image')
# {'path_to_nude_image': {'safe': 5.8822202e-08, 'unsafe': 1.0}}
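The classifier returns a dictionary mapping each input path to class scores. A minimal sketch for turning those scores into a boolean decision with a configurable threshold; `is_unsafe` is a hypothetical helper, not part of the NudeNet API:

```python
# Hypothetical helper (not part of NudeNet): convert the classifier's
# score dictionary into a boolean unsafe/safe decision.
def is_unsafe(scores, threshold=0.5):
    """Return True if the 'unsafe' score meets or exceeds the threshold."""
    return scores.get('unsafe', 0.0) >= threshold

# Using the output shape shown above:
result = {'path_to_nude_image': {'safe': 5.8822202e-08, 'unsafe': 1.0}}
flags = {path: is_unsafe(scores) for path, scores in result.items()}
print(flags)  # {'path_to_nude_image': True}
```

The threshold can be tuned per application; a stricter moderation pipeline might lower it to flag borderline images.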
The classifier is also available as a TensorFlow Serving Docker image
# Get the docker image
docker pull bedapudi6788/nudeclassifier:v1
docker run -d -p 8500:8500 bedapudi6788/nudeclassifier:v1
# Installing python client
pip install nudeclient
import nudeclient
# Single image prediction
nudeclient.predict('path_to_nude_image')
# {'path_to_nude_image': {'safe': 5.8822202e-08, 'unsafe': 1.0}}
# Batch predictions
nudeclient.predict(['path_to_image_1', 'path_to_image_2'])
# {'path_to_image_1': {'safe': 5.8822202e-08, 'unsafe': 1.0}, 'path_to_image_2': {'safe': 5.8822202e-08, 'unsafe': 1.0}}
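Since the batch output is a plain dictionary, splitting a batch into safe and unsafe paths is straightforward. A small sketch, assuming the output shape shown above; `partition_predictions` is a hypothetical helper, not part of nudeclient:

```python
# Hypothetical helper: split batch predictions into safe/unsafe lists
# by thresholding the 'unsafe' score.
def partition_predictions(predictions, threshold=0.5):
    safe, unsafe = [], []
    for path, scores in predictions.items():
        (unsafe if scores['unsafe'] >= threshold else safe).append(path)
    return safe, unsafe

batch = {
    'path_to_image_1': {'safe': 5.8822202e-08, 'unsafe': 1.0},
    'path_to_image_2': {'safe': 0.99, 'unsafe': 0.01},
}
safe, unsafe = partition_predictions(batch)
print(safe)    # ['path_to_image_2']
print(unsafe)  # ['path_to_image_1']
```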
Detector Usage
from nudenet import NudeDetector
detector = NudeDetector()
# Performing detection
detector.detect('path_to_nude_image')
# [{'box': [352, 688, 550, 858], 'score': 0.9603578, 'label': 'BELLY'}, {'box': [507, 896, 586, 1055], 'score': 0.94103414, 'label': 'F_GENITALIA'}, {'box': [221, 467, 552, 650], 'score': 0.8011624, 'label': 'F_BREAST'}, {'box': [359, 464, 543, 626], 'score': 0.6324697, 'label': 'F_BREAST'}]
# Censoring an image
detector.censor('path_to_nude_image', out_path='censored_image_path', visualize=False)
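Each detection is a dictionary with a bounding box (assumed to be `[x1, y1, x2, y2]`), a confidence score, and a class label. A hedged post-processing sketch that keeps only confident detections of selected classes, e.g. to decide which regions to blur; `filter_detections` is a hypothetical helper, not part of NudeDetector:

```python
# Hypothetical helper: filter NudeDetector.detect() output by
# confidence score and (optionally) by class label.
def filter_detections(detections, min_score=0.8, classes=None):
    return [
        d for d in detections
        if d['score'] >= min_score
        and (classes is None or d['label'] in classes)
    ]

# Using the example output shown above:
detections = [
    {'box': [352, 688, 550, 858], 'score': 0.9603578, 'label': 'BELLY'},
    {'box': [507, 896, 586, 1055], 'score': 0.94103414, 'label': 'F_GENITALIA'},
    {'box': [221, 467, 552, 650], 'score': 0.8011624, 'label': 'F_BREAST'},
    {'box': [359, 464, 543, 626], 'score': 0.6324697, 'label': 'F_BREAST'},
]
kept = filter_detections(detections, classes={'F_GENITALIA', 'F_BREAST'})
print([d['label'] for d in kept])  # ['F_GENITALIA', 'F_BREAST']
```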
The classifier dataset is available at https://archive.org/details/NudeNet_classifier_dataset_v1
To Do:
- Improve documentation for the functions. (Right now the user has to read the function definitions to understand all the parameters.)
- Convert these models to TF Lite and TF.js, and create another repo that uses TF.js to perform in-browser detection and censoring.
Note: Entire credit for collecting the object-recognition dataset goes to http://www.cti-community.net/ (NSFW). Their API is available at http://pury.fi/ and their Discord at https://discord.gg/k4qM4Jh
LICENSE:
Although NudeNet is licensed under the GPL, if you want to use it commercially without open-sourcing your code, please email me or raise an issue in this repo so that I can grant you explicit written permission to use it as you wish. The only reason for this requirement is that it would be nice to know if a company is using my work.
Hashes for NudeNet-1.1.0-py2.py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 185a47b8a682b1dc0d74c05b911b14c1e9bade9149c8cd7b074f26d79b755fda |
| MD5 | 6198302c5606ac4e88755238a2d2cefb |
| BLAKE2b-256 | 2c44b177dafa4e8f71c95e8fa58ed4e6e921d7cd204f2400579a69aacab526ca |