text-generation-webui
module 'torch' has no attribute 'LongTensor'
Describe the bug
I had gotten a few errors before, which were solved after some searching; the next error that came up was this one. I tried to install/reinstall with "conda install pytorch torchvision cudatoolkit=10.2 -c pytorch". After that, I tried launching again and it still came up with the same error. I thought it might have to do with the version of torch, but I wasn't sure which version, if that is the case.
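(A quick sanity check, assuming the web UI's own Python environment is the one that is active: confirm what torch that environment actually imports. This is only a diagnostic sketch, not a fix.)

import torch
print(torch.__version__)             # version the web UI is importing
print(hasattr(torch, "LongTensor"))  # True on any healthy PyTorch install
print(torch.cuda.is_available())     # whether the CUDA build can see the GPU

If LongTensor is missing or the import itself fails, the torch package in that environment is broken or half-installed rather than merely the wrong version.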
Is there an existing issue for this?
- [X] I have searched the existing issues
Reproduction
Launch "start-webui"
Wait a couple seconds
module 'torch' has no attribute 'LongTensor'
Press any key to continue . . .
Screenshot
Logs
Starting the web UI...
Warning: --cai-chat is deprecated. Use --chat instead.
Traceback (most recent call last):
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\installer_files\env\lib\site-packages\transformers\utils\import_utils.py", line 1125, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\installer_files\env\lib\importlib\__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 883, in exec_module
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\installer_files\env\lib\site-packages\transformers\generation\stopping_criteria.py", line 33, in <module>
class StoppingCriteria(ABC):
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\installer_files\env\lib\site-packages\transformers\generation\stopping_criteria.py", line 37, in StoppingCriteria
def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs) -> bool:
AttributeError: module 'torch' has no attribute 'LongTensor'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\text-generation-webui\server.py", line 18, in <module>
from modules import api, chat, shared, training, ui
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\text-generation-webui\modules\api.py", line 6, in <module>
from modules.text_generation import generate_reply
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\text-generation-webui\modules\text_generation.py", line 10, in <module>
from modules.callbacks import (Iteratorize, Stream,
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\text-generation-webui\modules\callbacks.py", line 13, in <module>
class _SentinelTokenStoppingCriteria(transformers.StoppingCriteria):
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\installer_files\env\lib\site-packages\transformers\utils\import_utils.py", line 1116, in __getattr__
value = getattr(module, name)
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\installer_files\env\lib\site-packages\transformers\utils\import_utils.py", line 1115, in __getattr__
module = self._get_module(self._class_to_module[name])
File "C:\Users\presi\Desktop\GTP\one-click-installers-oobabooga-windows\one-click-installers-oobabooga-windows\installer_files\env\lib\site-packages\transformers\utils\import_utils.py", line 1127, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.generation.stopping_criteria because of the following error (look up to see its traceback):
module 'torch' has no attribute 'LongTensor'
Press any key to continue . . .
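The failing attribute lookup happens while transformers defines StoppingCriteria, so the error can be reproduced outside the web UI. A minimal check, assuming the same installer_files\env environment is activated first:

# triggers the same torch.LongTensor annotation lookup as the web UI does
from transformers.generation.stopping_criteria import StoppingCriteria
print(StoppingCriteria)

If this import works on its own, the launcher is probably picking up a different Python than the one this ran in; if it fails the same way, the torch install inside that environment is the problem.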
System Info
Windows 11
Nvidia GTX 1050
Intel(R) i7-8650U
(I'm using a Microsoft Surface Book 2)
what model are you trying to run? what bit mode are you using? (flags)
I want to get GPT4 x Alpaca working.
Windows 64 bit? (I don't know too much about coding/programming)
What do you mean here: "conda install pytorch torchvision cudatoolkit=10.2 -c pytorch"?
It's a command I found while trying to research a solution, to install the latest version that supports GPU through the Anaconda prompt. (Again, I know very little about this, so I've probably stuffed up somewhere.)
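For what it's worth, cudatoolkit=10.2 pulls in a fairly old PyTorch build, and mixing a conda-installed torch into the one-click installer's environment may not match what the installer set up. If reinstalling torch inside that environment, something along these lines is more typical for current builds (the CUDA version here is an assumption; match it to your driver, and treat this as a sketch, not a verified fix):

pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu117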
EDIT: Successfully working now (28-APR) after a fresh Linux Mint install. Sadly, I don't have any particular clue what cured it.
ORIGINAL:
I get a nearly identical error (no LongTensor) on Linux using facebook opt-1.3.
This is my first time trying to install the web UI; it has never run successfully.
Linux Mint 21
Intel Celeron G3930
Nvidia 1080 Ti
16 GB RAM
Error copied and pasted below:
python3 server.py
Traceback (most recent call last):
File "/home/andrew/.local/lib/python3.10/site-packages/transformers/utils/import_utils.py", line 1125, in _get_module
return importlib.import_module("." + module_name, self.__name__)
File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/andrew/text-generation-webui/server.py", line 18, in
I'm having the same problem. Running it on WSL on Windows 10
We are facing the same problem on Ubuntu 18.04.
Having the same issue with Ubuntu 18.04
I just want to install WizardLM. For this I used the Oobabooga TextGen WebUI one-click installer for Windows and the Hugging Face model pygmalion-6b_dev-4bit-128g on Windows 11. But I get a nearly identical error when I try to start the web UI:
(Copied and pasted)
Gradio HTTP request redirected to localhost :)
Traceback (most recent call last):
File "C:\Users\jonas\Downloads\oobabooga_windows\installer_files\env\lib\site-packages\transformers\utils\import_utils.py", line 1146, in get_module
return importlib.import_module("." + module_name, self.name)
File "C:\Users\jonas\Downloads\oobabooga_windows\installer_files\env\lib\importlib_init.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "C:\Users\jonas\Downloads\oobabooga_windows\text-generation-webui\server.py", line 44, in
Done!
I am getting the same error on Ubuntu 22.04 with H100 GPU
This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.