💫 🤖 spaCy Curated Transformers
This package provides spaCy components and architectures to use a curated set of transformer models via curated-transformers in spaCy.
Features
- Use pretrained models based on one of the following architectures to power your spaCy pipeline:
- ALBERT
- BERT
- CamemBERT
- RoBERTa
- XLM-RoBERTa
- All the nice features supported by spacy-transformers, such as support for the Hugging Face Hub, multi-task learning, the extensible config system, and out-of-the-box serialization
- Deep integration into spaCy, which lays the groundwork for deployment-focused features such as distillation and quantization
- Minimal dependencies
⏳ Installation
Installing the package with pip will automatically install all dependencies.
pip install spacy-curated-transformers
🚀 Quickstart
An example project is provided in the project directory.
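As a minimal illustration of how the component is typically declared, a pipeline config might contain a fragment like the following. The factory and architecture names are taken from the package's registry as I understand it; treat them as assumptions and consult the example project and the documentation below for a complete, working config (which also requires settings such as a piece encoder).

```ini
# Sketch of a pipeline config fragment for spacy-curated-transformers.
# Names are assumptions based on the package's registered functions;
# a real config needs additional settings (e.g. a piece encoder).

[components.transformer]
factory = "curated_transformer"

[components.transformer.model]
@architectures = "spacy-curated-transformers.XlmrTransformer.v1"
```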
📖 Documentation
- 📘 Layers and Model Architectures: Power spaCy components with custom neural networks
- 📗 CuratedTransformer: Pipeline component API reference
- 📗 Transformer architectures: Architectures and registered functions
Bug reports and other issues
Please use spaCy's issue tracker to report a bug, or open a new thread on the discussion board for any other issue.