nsfw-classifier

This repository is dedicated to building a classifier that detects NSFW images and videos.




Table of Contents

  • Installation
  • Usage
  • Development
  • License

Installation

(Back to Top)

To use this project, first clone the repository to your device using the command given below:

git clone https://github.com/LaxmanSinghTomar/nsfw-classifier.git

Usage

(Back to Top)

Install the required libraries and packages using:

pip install -r requirements.txt

To download the dataset on which the model was trained, run:

sh src/scripts/data.sh

If it runs successfully, this creates a data directory in the project directory.

To run a quick demo on an image and a video, run:

sh src/scripts/inference.sh

To identify whether an image contains NSFW content using the default model, run:

python src/inference/inference_image.py [img-path]
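
For reference, here is a minimal sketch of what such an image-inference step can look like with a Keras model. The model file name, input size, and label order below are assumptions for illustration, not the repository's actual values:

# Hypothetical sketch: classify a single image with a Keras model.
# MODEL_PATH, the 224x224 input size, and LABELS are assumptions.
import sys
import numpy as np
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing import image

MODEL_PATH = "models/mobilenetv2.h5"   # assumed file name
LABELS = ["nsfw", "sfw"]               # assumed label order

model = load_model(MODEL_PATH)

# Load the image and preprocess it to the network's expected input shape.
img = image.load_img(sys.argv[1], target_size=(224, 224))
x = image.img_to_array(img) / 255.0    # scale pixels to [0, 1]
x = np.expand_dims(x, axis=0)          # add the batch dimension

probs = model.predict(x)[0]
print(LABELS[int(np.argmax(probs))], probs)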

To identify whether a video is NSFW using the default model, run:

python src/inference/inference_video.py [video-path]

The output video is saved in the output directory.
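
As a rough sketch, a frame-by-frame video pass might look like the following. The OpenCV loop is an illustration of the general approach; the model file, input size, labels, and output path are assumptions, and the output directory is assumed to exist:

# Hypothetical sketch: run the classifier over every frame of a video
# and write an annotated copy. Paths, input size, and labels are assumptions.
import sys
import cv2
import numpy as np
from tensorflow.keras.models import load_model

model = load_model("models/mobilenetv2.h5")   # assumed file name
LABELS = ["nsfw", "sfw"]                      # assumed label order

cap = cv2.VideoCapture(sys.argv[1])
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("output/result.mp4",    # assumed output path
                      cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # OpenCV reads frames as BGR; convert to RGB, resize to the model's
    # input size, and scale pixels to [0, 1].
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    x = cv2.resize(rgb, (224, 224)).astype("float32") / 255.0
    probs = model.predict(np.expand_dims(x, axis=0), verbose=0)[0]
    label = LABELS[int(np.argmax(probs))]
    # Stamp the predicted label on the frame and write it out.
    cv2.putText(frame, label, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
                1.0, (0, 0, 255), 2)
    out.write(frame)

cap.release()
out.release()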

Note: The default trained model is MobileNetV2; its small size means it loads quickly, which makes it a good fit for inference.

Development

(Back to Top)

.
├── LICENSE
├── models                         

If you wish to change the default model for predictions (MobileNetV2), change MODEL_PATH in src/config.py to point to either of the models available in the models directory.
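
For illustration, the relevant line in src/config.py might look like the excerpt below. The exact model file names here are assumptions; use whichever model files actually exist in the models directory:

# src/config.py (illustrative excerpt; file names are assumptions)
# Default: the lightweight MobileNetV2 model.
MODEL_PATH = "models/mobilenetv2.h5"

# To predict with a different trained model instead, point MODEL_PATH
# at that file, e.g.:
# MODEL_PATH = "models/inceptionv3.h5"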

License

(Back to Top)

GNU General Public License version 3