
🛠️ Tools for Transformers compression using PyTorch Lightning ⚡

8 bert-squeeze issues

When performing distillation in `soft` or `hard` mode, the way the datasets are concatenated is dubious. Lightning offers a handy solution for using multiple datasets (see [documentation](https://pytorch-lightning.readthedocs.io/en/stable/guides/data.html#multiple-datasets)), which will make...
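A minimal sketch of what this could look like with Lightning's multiple-dataloader support, assuming the two corpora are plain `torch.utils.data.Dataset` objects (`labeled_ds` / `soft_labeled_ds` are hypothetical names, not existing bert-squeeze objects):

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader, Dataset


class DistillationDataModule(pl.LightningDataModule):
    """Serves the two corpora side by side instead of concatenating them."""

    def __init__(self, labeled_ds: Dataset, soft_labeled_ds: Dataset, batch_size: int = 32):
        super().__init__()
        self.labeled_ds = labeled_ds
        self.soft_labeled_ds = soft_labeled_ds
        self.batch_size = batch_size

    def train_dataloader(self):
        # Returning a dict of loaders lets Lightning combine them per step:
        # each training batch then looks like {"labeled": ..., "soft_labeled": ...}.
        return {
            "labeled": DataLoader(self.labeled_ds, batch_size=self.batch_size, shuffle=True),
            "soft_labeled": DataLoader(self.soft_labeled_ds, batch_size=self.batch_size, shuffle=True),
        }
```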

enhancement
good first issue
distillation

```bash
initializing ddp: GLOBAL_RANK: 1, MEMBER: 2/2
[2021-10-27 06:28:58,303][torch.distributed.distributed_c10d][INFO] - Added key: store_based_barrier_key:1 to store for rank: 1
[2021-10-27 06:28:58,316][torch.distributed.distributed_c10d][INFO] - Added key: store_based_barrier_key:1 to store for rank: 0
[2021-10-27...
```

bug
help wanted

This PR aims at refactoring the whole repository and documenting it. The main idea is to make the repository easier to use and to release it as a PyPI package.

refacto

Having proper documentation and examples would help people get started with the repo.

documentation

Let's add a callback to export the model in the ONNX format.
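A rough sketch of what such a callback could look like, built on `LightningModule.to_onnx`; the default file path and the choice of the `on_train_end` hook are assumptions, not a settled design:

```python
import pytorch_lightning as pl


class OnnxExportCallback(pl.Callback):
    """Exports the trained model to ONNX once training finishes."""

    def __init__(self, file_path: str = "model.onnx"):
        self.file_path = file_path

    def on_train_end(self, trainer: pl.Trainer, pl_module: pl.LightningModule) -> None:
        # `to_onnx` wraps `torch.onnx.export`; it needs an example input, either
        # passed explicitly or via `example_input_array` set on the module.
        pl_module.to_onnx(self.file_path, export_params=True)
```

The callback could then simply be handed to the trainer, e.g. `pl.Trainer(callbacks=[OnnxExportCallback()])`.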

enhancement

Link to the style guide [here](https://lightning.ai/docs/pytorch/stable/starter/style_guide.html).

good first issue
refacto
priority:low

Hi @JulesBelveze, I trained a speech enhancement network using pytorch-lightning, and now I want to do some lightweighting on it. Can I use this project? The network structure is Convolution...

question

DeeBert models need to be fine-tuned in a two-step fashion: first the final layer and then the ramps. The current implementation requires the user to do two different training....
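A hypothetical sketch of how the two stages could be chained behind a single entry point; `ramp_parameters()` is a placeholder helper for accessing the off-ramp weights, not an existing bert-squeeze API:

```python
import pytorch_lightning as pl


def finetune_deebert(model: pl.LightningModule, datamodule: pl.LightningDataModule) -> None:
    # Stage 1: fine-tune the final classification layer with the ramps frozen.
    for p in model.ramp_parameters():  # placeholder helper returning the ramp params
        p.requires_grad = False
    pl.Trainer(max_epochs=3).fit(model, datamodule=datamodule)

    # Stage 2: freeze everything else and fine-tune the ramps only.
    for p in model.parameters():
        p.requires_grad = False
    for p in model.ramp_parameters():
        p.requires_grad = True
    pl.Trainer(max_epochs=3).fit(model, datamodule=datamodule)
```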

enhancement