Error while using the Google Colab notebooks: cannot import name 'DeepSpeedPlugin' from 'pytorch_lightning.plugins'
I tried both Google Colab notebooks (train from scratch and finetune with GPT-Neo); both have an issue while importing the module and give:
cannot import name 'DeepSpeedPlugin' from 'pytorch_lightning.plugins' (on the first cell while importing aitextgen)
I tried to uninstall and reinstall the libraries as mentioned here, but it did not do the trick.
Is there a workaround?
Thanks in advance.
Having same issue
Having the same issue but locally (Ubuntu 22.04)
Have the exact same issue on Colab
This may be a compatibility issue with the new 2.0.0 release of pytorch-lightning. I haven't fully tested anything myself, but reverting to a previous version of pytorch-lightning, such as 1.9.4, may be a workaround.
pip uninstall pytorch-lightning
pip install pytorch-lightning==1.9.4
Another possibility is to install directly from master. https://github.com/minimaxir/aitextgen/commit/78a0bfdfaff917cad6430d0d12b43f58a4b3226d looks like it comments out DeepSpeedPlugin, which may solve the issue.
pip install git+https://github.com/minimaxir/aitextgen
https://github.com/minimaxir/aitextgen/pull/216 also seems to provide some bug fixes. Could try:
pip install git+https://github.com/scorixear/aitextgen
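If it's unclear which version actually got picked up (on Colab a runtime restart is often needed after reinstalling), a quick check is:
import pytorch_lightning
print(pytorch_lightning.__version__)  # should report 1.9.4 after the downgrade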
I tried
pip uninstall pytorch-lightning
pip install pytorch-lightning==1.9.4
and also
pip install git+https://github.com/minimaxir/aitextgen
pip install git+https://github.com/scorixear/aitextgen
but still get the same error.
Thanks for your answer. I tried this on Google Colab:
#!pip install -q aitextgen
!pip install git+https://github.com/scorixear/aitextgen
!pip uninstall pytorch-lightning
!pip install pytorch-lightning==1.9.4
and still got an error (a different one, but an error nonetheless):
ModuleNotFoundError: No module named 'pytorch_lightning.callbacks.progress.progress_bar'
It seems that aitextgen needs a very specific version of pytorch-lightning.
Hi. Is there any solution to this?
Maybe it's related to 7bfbddb, a rollback to an earlier PyTorch-lightning around 1.3.1
Tried installing 1.3.1 instead of 1.9.4 and got
ImportError: cannot import name 'get_num_classes' from 'torchmetrics.utilities.data' (/usr/local/lib/python3.9/dist-packages/torchmetrics/utilities/data.py)
Did you also try a version prior to that commit I mentioned?
On Ubuntu 22.04 doing pip install pytorch-lightning==1.7.0 got it working for me with the demo. Trying to use the training example gives:
pytorch_lightning.utilities.exceptions.MisconfigurationException: The provided lr scheduler LambdaLR doesn't follow PyTorch's LRScheduler API. You should override the LightningModule.lr_scheduler_step hook with your own logic if you are using a custom LR scheduler.
I've tried various versions of the torch libraries + aitextgen and none of them work as yet.
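For what it's worth, that MisconfigurationException is asking for an lr_scheduler_step override on the LightningModule. aitextgen constructs its LightningModule internally, so this isn't a drop-in fix, but as a rough sketch of the hook the error mentions (signature assumed for pytorch-lightning 1.x; 2.x drops the optimizer_idx argument):
import pytorch_lightning as pl

class PatchedModule(pl.LightningModule):
    # hypothetical override, following the error message's suggestion
    def lr_scheduler_step(self, scheduler, optimizer_idx, metric):
        # LambdaLR takes no metric, so just step the scheduler
        scheduler.step()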
I managed to resolve the error on Google Colab by running this:
!pip install -qq pytorch-lightning==1.7.0 transformers==4.21.3 aitextgen==0.6.0
Please do let me know if it also solves your issue on Colab.
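As a quick sanity check after installing (this just loads the default small GPT-2 model; swap in your own model or folder as needed):
from aitextgen import aitextgen

ai = aitextgen()                 # downloads the default 124M GPT-2 model
ai.generate(n=1, max_length=50)  # prints a short sample to confirm things work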
It works fine in both Colab and locally (CPU) for me, thanks.
What distro are you using?
Thanks Darrin
pip install -qq pytorch-lightning==1.7.0 made it work for me on Win10 x64 Py 3.11
Thanks Veritas83
Thanks, that combo doesn't work with Python 3.8.10, which is the default for Ubuntu 22.04, so I need to try a newer version of Python.
On Ubuntu 22.04 doing pip install pytorch-lightning==1.7.0 got it working for me with the demo. Trying to use the training example gives:
pytorch_lightning.utilities.exceptions.MisconfigurationException: The provided lr scheduler LambdaLR doesn't follow PyTorch's LRScheduler API. You should override the LightningModule.lr_scheduler_step hook with your own logic if you are using a custom LR scheduler.
I am getting the same error while training, has anyone found a fix?
Same issue for me (Python 3.10.8, macOS, pip install pytorch-lightning==1.7.0). Error when training.
The "provided lr scheduler LambdaLR doesn't follow PyTorch's LRScheduler API" error also seems to be discussed in issue #219.
try this:
#!pip install -q aitextgen
!pip install git+https://github.com/scorixear/aitextgen
#!pip uninstall pytorch-lightning
!pip install pytorch-lightning==1.9.4
It works, but I get this error: ModuleNotFoundError: No module named 'transformers.models.cpmant.configuration_cpmant'. Can you help me? I think it's due to the transformers version.
I've tried the various suggestions here, and none of them work with both generation and fine tuning.
With this setup:
python 3.11 - Windows 64bit
torch == 2.0.1
pytorch-lightning == 2.0.6
You get this error for both text generation and fine tuning:
ImportError: cannot import name 'DeepSpeedPlugin' from 'pytorch_lightning.plugins'
Downgrading pytorch-lightning to 1.9.4 gives this error:
ImportError: cannot import name '_TPU_AVAILABLE' from 'pytorch_lightning.utilities'
I've chased these errors and have had a case where text generation works (though I unfortunately lost that particular combination of packages), but fine tuning failed. I'm in need of fine tuning so this isn't good for my use case.
The requirements file suggests that a wide range of package versions is supported, but it seems increasingly apparent that a narrow set of pinned versions should be recommended instead.
Do we have any insight into what exact package/Python versions allow both text generation and fine tuning to work?
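For reference, the narrowest pin I've seen reported in this thread is the Colab combination above, i.e. something like:
pip install aitextgen==0.6.0 pytorch-lightning==1.7.0 transformers==4.21.3
though as noted earlier, the LambdaLR training error may still bite with that set.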
1.9.4 really works
I have the same problem. Tried all solutions proposed, none of them work.
I know this issue is over a year old, but I decided to track down which package versions cause this and similar import-time problems, and ended up with this working configuration:
import aitextgen tested on Python 3.13.2 on Windows 11 64-bit, with:
- Rust (for compiling tokenizers)
and these Python packages:
pip<24.1
aitextgen==0.6.0
pytorch-lightning<1.8.0
transformers>=4.34.0,<4.40.0
torchmetrics<1.0.0
notes:
- pip<24.1 is due to
DEPRECATION: pytorch-lightning 1.7.7 has a non-standard dependency specifier torch>=1.9.*. pip 24.1 will enforce this behaviour change.
- transformers 4.40.0 requires tokenizers 0.19.x, which fails to build on Python 3.13, due to
error: the configured Python interpreter version (3.13) is newer than PyO3's maximum supported version (3.12)
= help: please check if an updated version of PyO3 is available. Current version: 0.21.2
I haven't tested anything other than just importing the package, but at least that works.
I haven't checked with scorixear's fork yet, but I will at some point.
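For reference, installing that set in one go would look roughly like this (quoting the specifiers so the shell doesn't eat the < and > characters):
python -m pip install "pip<24.1"
pip install aitextgen==0.6.0 "pytorch-lightning<1.8.0" "transformers>=4.34.0,<4.40.0" "torchmetrics<1.0.0"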