Marcin Kardas
Hi @maorgo, can you check if the problem still appears in galai version 1.1.0?
Hi all, in galai 1.1.0 we switched to transformers for checkpoint management. See https://huggingface.co/docs/transformers/installation#cache-setup for details on where the cache is located and how to change it....
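For example, one way to redirect the cache is to point the Hugging Face environment variable at a custom directory before loading a model. A minimal sketch (the path and the `mini` checkpoint name below are only placeholders):

```python
import os

# Point the Hugging Face cache at a custom location. This has to be set
# before transformers/galai are imported; the path is just an example.
os.environ["HF_HOME"] = "/data/hf_cache"

import galai as gal

# Checkpoints downloaded by load_model now end up under /data/hf_cache.
model = gal.load_model("mini")
```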
Hi @MaximeTut, thanks for reporting. We added a minimum transformers version to the dependencies, so it should work now.
galai 1.1.0 uses all available GPUs by default, which should fix the issue. One can still manually specify the number of GPUs using the `num_gpus` parameter. Setting `num_gpus=0` (or keeping the...
Hi @Naugustogi, can you check if you still experience the issues with galai version 1.1.0? You should be able to use the model on CPU with `load_model(..., num_gpus=0)`.
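To make the above concrete, here is a minimal sketch of both cases (the `mini` checkpoint name and the prompt are only examples):

```python
import galai as gal

# Spread the checkpoint across two GPUs instead of all available ones.
gpu_model = gal.load_model("mini", num_gpus=2)

# Run entirely on CPU.
cpu_model = gal.load_model("mini", num_gpus=0)

print(cpu_model.generate("The Transformer architecture"))
```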
@Naugustogi any chance you can provide the full stack trace?
Thanks @Naugustogi. The traceback shows `galai` 1.0.0. Can you try with 1.1.2?
@Naugustogi You can install it with `pip` or clone the main git branch (currently at 1.1.2, you can verify by inspecting the [setup.py](https://github.com/paperswithcode/galai/blob/471a22d152079baa5823b7bdf18aa3ad0b47b39e/setup.py) file in your installation).
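If in doubt, a quick way to check which version is actually installed (independent of any local clone on your path) is:

```python
from importlib.metadata import version

# Should report 1.1.2 once the upgrade worked.
print(version("galai"))
```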
> it returns the main folder

What do you mean? If you are running it as a script, you need to wrap the last line in `print()`.
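To illustrate (the helper below is just a stand-in for whatever call returns the folder):

```python
def returns_the_folder():
    # Stand-in for the call that returns the cache folder path.
    return "/home/user/.cache"

returns_the_folder()         # a bare call prints nothing when run as a script
print(returns_the_folder())  # wrapping it in print() shows the returned value
```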
Hi @wladerer, can you try galai version 1.1.0? It should work with Python 3.10.