
OverflowError on macOS when not using --cpu

Open cmdrf opened this issue 2 years ago • 2 comments

Describe the bug

When I attempt text generation without passing --cpu to server.py, I get OverflowError: out of range integral type conversion attempted.

I tried several models and always got the same error. All of them work with --cpu.

Is there an existing issue for this?

  • [X] I have searched the existing issues

Reproduction

conda create -n textgen python=3.10.9
conda activate textgen
# pytorch nightly, as suggested in other comments.
# Stable version gives "RuntimeError: MPS does not support cumsum op with int64 input" instead.
pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/cpu
pip install -r requirements.txt
python server.py

Then press "Generate" in the web UI.
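
For comparison, the invocation that does work (as noted above, every model runs fine on the CPU):

python server.py --cpu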

Screenshot

No response

Logs

Traceback (most recent call last):
  File "/Users/fabian/devel/text-generation-webui/modules/text_generation.py", line 237, in generate_reply
    reply = decode(output[-new_tokens:])
  File "/Users/fabian/devel/text-generation-webui/modules/text_generation.py", line 48, in decode
    reply = shared.tokenizer.decode(output_ids, skip_special_tokens=True)
  File "/usr/local/Caskroom/miniconda/base/envs/textgen/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 3477, in decode
    return self._decode(
  File "/usr/local/Caskroom/miniconda/base/envs/textgen/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 549, in _decode
    text = self._tokenizer.decode(token_ids, skip_special_tokens=skip_special_tokens)
OverflowError: out of range integral type conversion attempted

System Info

- 2019 Mac Pro
- AMD Radeon Pro W5700X 16 GB
- macOS Ventura 13.3
- torch-2.1.0.dev20230401 torchaudio-2.1.0.dev20230401 torchvision-0.16.0.dev20230401

cmdrf avatar Apr 01 '23 22:04 cmdrf

I could reproduce the issue with a minimal huggingface/transformers example, so I created an issue there: https://github.com/huggingface/transformers/issues/22550. Leaving this open for tracking.
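
For reference, a sketch of what such a minimal reproducer might look like; this is not the exact code from the linked issue, and the model ("gpt2") and prompt are placeholder choices:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Any small causal LM works for illustration; "gpt2" is a placeholder choice.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").to("mps")

input_ids = tokenizer("Hello, my name is", return_tensors="pt").input_ids.to("mps")
output = model.generate(input_ids, max_new_tokens=20)

# The decode step below is where the OverflowError above is raised; printing the
# raw token IDs first shows whether generation on MPS returned out-of-range values.
print(output[0].tolist())
print(tokenizer.decode(output[0], skip_special_tokens=True))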

It seems to work on a MacBook with Apple Silicon btw.

cmdrf avatar Apr 04 '23 06:04 cmdrf

I could also reproduce the error on a 2023 MacBook Pro (M2 Pro chip) running Ventura 13.3.1.

chunhualiao avatar Apr 23 '23 17:04 chunhualiao

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.

github-actions[bot] avatar Nov 21 '23 23:11 github-actions[bot]