
GIMP-ML-Hub

Machine Learning plugins for GIMP.

Forked from the original version to improve the user experience in several respects:

  • The PyTorch models are packaged in PyTorch Hub format and are only downloaded as needed. This allows new models to be added more seamlessly, without needing to re-download gigabytes of model weights.
  • Models run under Python 3, avoiding the effort of back-porting them to Python 2.
  • Fully automatic installation, tested on all major operating systems and distributions.
  • Errors are now reported directly in the UI, rather than on the command line only.
  • Correct handling of alpha channels.
  • Automatic conversion between RGB/grayscale as needed by the models.
  • Results are always added to the same image instead of creating a new one.
  • And many other smaller improvements.
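The alpha-channel and RGB/grayscale handling mentioned above can be sketched roughly as follows. This is an illustrative example, not the plugin's actual code: the function names and the NumPy-array representation are assumptions, but they show the general idea of splitting off alpha before inference and converting grayscale input to the 3-channel layout most models expect.

```python
import numpy as np

def prepare_for_model(img):
    """Split off the alpha channel (if any) and convert the image to
    3-channel RGB. Returns (rgb, alpha), where alpha may be None."""
    # img: H x W or H x W x C uint8 array, C in {1, 3, 4}
    alpha = None
    if img.ndim == 3 and img.shape[2] == 4:
        alpha = img[:, :, 3]          # keep alpha aside for later
        img = img[:, :, :3]
    if img.ndim == 2 or img.shape[2] == 1:
        # grayscale -> RGB by replicating the single channel
        img = np.repeat(img.reshape(img.shape[0], img.shape[1], 1), 3, axis=2)
    return img, alpha

def restore_alpha(rgb, alpha):
    """Reattach a previously split alpha channel to the model output."""
    if alpha is None:
        return rgb
    return np.dstack([rgb, alpha])
```

With this pattern the model only ever sees plain RGB data, and transparency survives the round trip unchanged.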

The plugins have been tested with GIMP 2.10 on the following systems:

  • macOS Catalina 10.15.5
  • Ubuntu 18.04 LTS
  • Ubuntu 20.04 LTS (apt-get only, snap is not yet supported)
  • Debian 10 (buster)
  • Arch Linux
  • Windows 10

Installation Steps

  1. Install GIMP.
  2. Clone this repository: git clone https://github.com/valgur/GIMP-ML-Hub.git
  3. On Linux and macOS, run ./install.sh.
  4. On Windows:
  5. You should now find the GIMP-ML plugins under Layers → GIMP-ML. Feel free to create an issue if they are missing for some reason.
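On Linux and macOS, steps 2–3 boil down to the following commands (this assumes GIMP is already installed and git is available; what install.sh does internally is not shown here):

```shell
# Fetch the plugin sources and run the bundled installer
git clone https://github.com/valgur/GIMP-ML-Hub.git
cd GIMP-ML-Hub
./install.sh
```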

References

MaskGAN

  • Source: https://github.com/switchablenorms/CelebAMask-HQ
  • Torch Hub fork: https://github.com/valgur/CelebAMask-HQ
  • License:
    • CC BY-NC-SA 4.0
    • Copyright (C) 2017 NVIDIA Corporation. All rights reserved.
    • Restricted to non-commercial research and educational purposes
  • C.-H. Lee, Z. Liu, L. Wu, and P. Luo, “MaskGAN: Towards Diverse and Interactive Facial Image Manipulation,” in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2019.

Face Parsing

  • Source: https://github.com/zllrunning/face-parsing.PyTorch
  • Torch Hub fork: https://github.com/valgur/face-parsing.PyTorch
  • License: MIT
  • Based on BiSeNet:

SRResNet

DeblurGANv2

MiDaS

Monodepth2

  • Source: https://github.com/nianticlabs/monodepth2
  • Torch Hub fork: https://github.com/valgur/monodepth2
  • License:
    • See the license file for terms
    • Copyright © Niantic, Inc. 2019. Patent Pending. All rights reserved.
    • Non-commercial use only
  • C. Godard, O. Mac Aodha, M. Firman, and G. Brostow, “Digging Into Self-Supervised Monocular Depth Estimation,” in 2019 IEEE/CVF International Conference on Computer Vision (ICCV), 2019, pp. 3827–3837.

Neural Colorization

  • Source: https://github.com/zeruniverse/neural-colorization
  • Torch Hub fork: https://github.com/valgur/neural-colorization
  • License:
    • GNU GPL 3.0 for personal or research use
    • Commercial use prohibited
    • Model weights released under CC BY 4.0
  • Based on fast-neural-style:
    • https://github.com/jcjohnson/fast-neural-style
    • License:
      • Free for personal or research use
      • For commercial use please contact the authors
    • J. Johnson, A. Alahi, and L. Fei-Fei, “Perceptual Losses for Real-Time Style Transfer and Super-Resolution,” in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 9906 LNCS, 2016, pp. 694–711.

Authors

  • Martin Valgur (valgur) – this version
  • Kritik Soman (kritiksoman) – original GIMP-ML implementation

License

MIT

Please note that additional license terms apply for each individual model. See the references list for details. Many of the models restrict usage to non-commercial or research purposes only.