fluence
A deep learning library based on PyTorch, focused on low-resource language research and robustness.
Winner of the PyTorch Global Hackathon 2020.
Fluence is a PyTorch-based deep learning library focused on providing computationally efficient, low-resource methods and algorithms for NLP. Although the main focus is supporting transformers for NLP tasks, it can be extended to other domains and architectures as well. It is currently in the pre-alpha stage.
List of implemented papers
- Adaptive Attention Span in Transformers (ACL 2019)
- Adaptively Sparse Transformers (EMNLP 2019)
- Reducing Transformer Depth on Demand with Structured Dropout (ICLR 2020)
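To give a flavor of the structured-dropout idea behind the LayerDrop paper above (ICLR 2020), here is a minimal, framework-agnostic sketch; the function name and signature are illustrative assumptions, not fluence's actual API:

```python
import random

def layerdrop_forward(x, layers, p_drop=0.2, training=True):
    """Apply a stack of layers, skipping each one independently with
    probability p_drop during training (LayerDrop / structured dropout)."""
    for layer in layers:
        if training and random.random() < p_drop:
            continue  # drop the entire layer, not individual units
        x = layer(x)
    return x
```

In a real transformer, each skipped unit would be a full residual block; dropping whole layers during training makes the network robust to depth reduction, so layers can be pruned at inference simply by removing them from the stack.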
Why Fluence?
Fluence targets two main goals:
- Compute efficiency: methods and algorithms for low-resource research.
- Robustness: algorithms that either deepen our understanding of current methods or show where SoTA methods fail.
It is as straightforward to use as HF Transformers and integrates fully with PyTorch. Please note that the current modules (meta-trainer, siamese-trainer), which rely on the inherited Trainer, work with transformers==3.0; newer versions ship a modified Trainer and may not be compatible.
For the stable version:
pip3 install --user fluence
For the development version (recommended):
git clone https://github.com/prajjwal1/fluence
cd fluence
python3 setup.py install --user
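Since the trainer modules expect transformers==3.0 (see the note above), you may want to pin that dependency explicitly; the wildcard patch version shown here is an assumption:

```shell
pip3 install --user "transformers==3.0.*"
```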
The library contains implementations of the following approaches (many more to come):

| Method (with documentation) |
| --- |
| Siamese Methodology, Debiasing |
Please head to this link to learn how you can integrate fluence with your workflow. Since this is an early release, there may be bugs; please file an issue if you encounter one. Docs are a work in progress.
You can contribute by filing an issue or sending a pull request (if you encounter a bug or want a feature added). Please check out the contributing guide for more details.
Fluence comes with an extensive test suite for high test coverage.
pytest tests/ -v
Author: Prajjwal Bhargava (@prajjwal_1)