MiniGPT-4
salesforce-lavis is not compatible with the latest transformers.
When I tried to run the Colab notebook, I got an error at this step:

```shell
!pip install -q h5py
!pip install -q typing-extensions
!pip install -q wheel
!pip install -q git+https://github.com/huggingface/transformers.git -U
```

The error message says:

```
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
salesforce-lavis 1.0.2 requires transformers<4.27,>=4.25.0, but you have transformers 4.29.0.dev0 which is incompatible.
```
If I restrict the transformers version to below 4.27, I get this error when running the model:

```
ModuleNotFoundError: No module named 'transformers.models.llama'
```

It seems LLaMA support was added to transformers after 4.27.
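For reference, the two requirements above cannot both be satisfied by any single transformers release. A minimal sketch of the version arithmetic (the 4.28.0 cutoff for LLaMA support is an assumption based on the error above):

```python
# Compare the salesforce-lavis pin against the first transformers release
# that (by assumption) ships transformers.models.llama.
def parse(version):
    """Turn a 'major.minor.patch' string into a comparable tuple of ints."""
    return tuple(int(part) for part in version.split(".")[:3])

LAVIS_UPPER_BOUND = parse("4.27.0")    # salesforce-lavis 1.0.2: transformers<4.27
LLAMA_FIRST_RELEASE = parse("4.28.0")  # assumption: LLaMA first shipped in 4.28.0

# A release with LLaMA must be >= 4.28.0, but lavis needs < 4.27.0.
compatible = LLAMA_FIRST_RELEASE < LAVIS_UPPER_BOUND
print(compatible)  # False: no transformers version satisfies both constraints
```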
Hello, did you solve your problem? I ran into the same issue.
You can install salesforce-lavis 1.0.0 to solve this problem.
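For anyone else hitting this, a sketch of the install cell with the suggested pin (assuming salesforce-lavis 1.0.0 does not carry the transformers<4.27 restriction):

```shell
!pip install -q salesforce-lavis==1.0.0
!pip install -q git+https://github.com/huggingface/transformers.git -U
```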
Hey, I did that and it seems to solve the problem. However, when I run the demo script, it gets stuck at loading checkpoint shards and then exits without an error. Could it be related?
I am working in Colab Pro.
```shell
!python demo.py --cfg-path eval_configs/minigpt4_eval.yaml --gpu-id 0
```

```
/usr/local/lib/python3.9/dist-packages/requests/__init__.py:102: RequestsDependencyWarning: urllib3 (1.26.15) or chardet (5.1.0)/charset_normalizer (2.0.12) doesn't match a supported version!
  warnings.warn("urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported "
2023-04-23 01:44:25.093945: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Initializing Chat
Loading VIT
Loading VIT Done
Loading Q-Former
Loading Q-Former Done
Loading LLAMA

BUG REPORT
Welcome to bitsandbytes. For bug reports, please submit your error trace to:
https://github.com/TimDettmers/bitsandbytes/issues

Loading checkpoint shards:   0% 0/2 [00:00<?, ?it/s]^C
```