
ImportError: Using `load_in_8bit=True` requires Accelerate: `pip install accelerate` and the latest version of bitsandbytes `pip install -i https://test.pypi.org/simple/ bitsandbytes` or `pip install bitsandbytes`, when in reality it's a torch issue

Open dataf3l opened this issue 2 years ago • 11 comments

bitsandbytes reports this error:

(venv) ➜  image-captioning-v2 python captionit3.py
True
False
Traceback (most recent call last):
  File "/Users/b/study/ml/image-captioning-v2/captionit3.py", line 14, in <module>
    model = Blip2ForConditionalGeneration.from_pretrained("Salesforce/blip2-opt-6.7b-coco", device_map='auto', quantization_config=nf4_config)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/b/study/ml/image-captioning-v2/venv/lib/python3.11/site-packages/transformers/modeling_utils.py", line 2616, in from_pretrained
    raise ImportError(
ImportError: Using `load_in_8bit=True` requires Accelerate: `pip install accelerate` and the latest version of bitsandbytes `pip install -i https://test.pypi.org/simple/ bitsandbytes` or pip install bitsandbytes`
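For reference, the relevant part of captionit3.py presumably looks roughly like the reconstruction below; the actual nf4_config is not shown in the report, so the BitsAndBytesConfig values here are assumptions for illustration only:

import torch
from transformers import Blip2ForConditionalGeneration, BitsAndBytesConfig

# Reconstructed for illustration; the original nf4_config is not shown above,
# so these 4-bit NF4 settings are assumptions.
nf4_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

model = Blip2ForConditionalGeneration.from_pretrained(
    "Salesforce/blip2-opt-6.7b-coco",
    device_map="auto",
    quantization_config=nf4_config,
)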

However, the error is inaccurate, because the real issue is in this function:

def is_bitsandbytes_available():
    if not is_torch_available():
        return False

    # bitsandbytes throws an error if cuda is not available
    # let's avoid that by adding a simple check
    import torch

    return _bitsandbytes_available and torch.cuda.is_available()

If somebody accidentally uninstalls torch, this happens. So maybe one should improve the error message, and maybe emitting a message here telling the user "unable to import torch" would be useful, who knows.
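A minimal sketch of what such a friendlier check could look like, purely illustrative and not the actual transformers implementation:

import importlib.util
import logging

logger = logging.getLogger(__name__)

def is_bitsandbytes_available():
    # Sketch only: report *why* bitsandbytes support is considered unavailable
    # instead of silently returning False.
    if importlib.util.find_spec("torch") is None:
        logger.warning("unable to import torch; bitsandbytes requires PyTorch.")
        return False

    import torch

    if importlib.util.find_spec("bitsandbytes") is None:
        return False

    # bitsandbytes currently requires a CUDA device.
    if not torch.cuda.is_available():
        logger.warning("bitsandbytes is installed but no CUDA device is available.")
        return False

    return True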

tell your friends! :)

dataf3l avatar Oct 22 '23 06:10 dataf3l

Are you on MacOS? Had the same issue on it, swapped to windows (remote ssh) and searching for a different issue lol

Its3rr0rsWRLD avatar Oct 29 '23 03:10 Its3rr0rsWRLD

I got the same ERROR

oushu1zhangxiangxuan1 avatar Oct 30 '23 02:10 oushu1zhangxiangxuan1

Are you on MacOS? Had the same issue on it, swapped to windows (remote ssh) and searching for a different issue lol

Yes. Same issue on MacOS

SoyGema avatar Oct 30 '23 13:10 SoyGema

same issue... are there any updates?

effortprogrammer avatar Nov 20 '23 05:11 effortprogrammer

same issue

pechaut78 avatar Nov 27 '23 16:11 pechaut78

Same issue

RamsesCamas avatar Dec 07 '23 14:12 RamsesCamas

Well, as said above, the error is not that the lib is improperly installed: the error message is misleading. The issue is that bitsandbytes is not implemented on Apple Silicon (mps). So bitsandbytes cannot be used and the code should be adapted! sigh

Please see:

https://github.com/TimDettmers/bitsandbytes/issues/485
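As an illustration (not taken from the linked issue), one way to adapt such code on machines without CUDA, e.g. Apple Silicon, is to only request bitsandbytes quantization when a CUDA device is actually present; the model name and config values below are just examples:

import torch
from transformers import Blip2ForConditionalGeneration, BitsAndBytesConfig

model_id = "Salesforce/blip2-opt-6.7b-coco"

if torch.cuda.is_available():
    # CUDA present: bitsandbytes quantization can be used.
    model = Blip2ForConditionalGeneration.from_pretrained(
        model_id,
        device_map="auto",
        quantization_config=BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_quant_type="nf4"),
    )
else:
    # No CUDA (e.g. Apple Silicon): load unquantized and move to mps or cpu.
    model = Blip2ForConditionalGeneration.from_pretrained(model_id)
    model.to("mps" if torch.backends.mps.is_available() else "cpu")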

pechaut78 avatar Dec 07 '23 14:12 pechaut78

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

github-actions[bot] avatar Dec 31 '23 15:12 github-actions[bot]

This is a great catch. Can you please submit this to the transformers github repo? This is only indirectly a bitsandbytes issue.

TimDettmers avatar Jan 08 '24 15:01 TimDettmers

To me it's not entirely clear where Mac comes into play and how we would best warn that Mac is not supported.

@pechaut78 how did you deduce that it must be Mac related? And why does the code that throws the traceback get triggered?

Titus-von-Koeller avatar Jan 26 '24 21:01 Titus-von-Koeller

Hi - the core issue is that currently in transformers is_bitsandbytes_available() silently returns False if you don't have a CUDA device, i.e. if torch.cuda.is_available() returns False: https://github.com/huggingface/transformers/blob/cd2eb8cb2b40482ae432d97e65c5e2fa952a4f8f/src/transformers/utils/import_utils.py#L623

This is not ideal, as we should display a more informative warning instead. @Titus-von-Koeller would you be happy to open a quick PR on transformers to add a logger.info when torch.cuda.is_available() is False, to clearly state to users that is_bitsandbytes_available() will silently be set to False? Otherwise happy to do it as well.
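A minimal sketch of that change against the is_bitsandbytes_available() quoted earlier in this thread (assumed, not an actual PR; it reuses the existing is_torch_available(), _bitsandbytes_available and logger names from transformers' import_utils.py):

def is_bitsandbytes_available():
    if not is_torch_available():
        return False

    import torch

    # Proposed addition: tell the user why the check fails instead of
    # silently returning False when no CUDA device is present.
    if not torch.cuda.is_available():
        logger.info(
            "bitsandbytes is installed, but torch.cuda.is_available() is False, "
            "so is_bitsandbytes_available() will return False."
        )
        return False

    return _bitsandbytes_available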

younesbelkada avatar Jan 29 '24 23:01 younesbelkada