FastChat
The conversion command cannot locate the installed transformers
Hi, when I run the conversion command for the 7B model to generate the diff between LLaMA and Vicuna, it gives the error below:
from transformers import AutoTokenizer, AutoModelForCausalLM
ModuleNotFoundError: No module named 'transformers'
But when I check with pip3 show transformers, it shows that transformers is already installed. This may be a very simple question, but is there any suggestion? Thanks, I've been stuck here for days. Only Python 3.10 is installed and no virtual env has been set up.
Check which environment you are using. Open a plain Python prompt in that environment and see if it can locate transformers.
Alternatively, use which python to see which Python interpreter you are actually running.
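Here is a minimal sketch of both checks in one script (standard library only; the filename check_env.py is just an illustration). Run it with the exact command you use to launch the conversion script, and compare the output against what pip3 show transformers reports under Location.

```python
# check_env.py -- run with the same command you use for the conversion
# script, e.g. `python3 check_env.py`.
import importlib.util
import sys

# The interpreter actually executing this script; compare with `which python`.
print("interpreter:", sys.executable)

# Where (if anywhere) this interpreter resolves the transformers package.
spec = importlib.util.find_spec("transformers")
print("transformers:", spec.origin if spec else "NOT FOUND on sys.path")
```

If it prints NOT FOUND, the pip3 you installed with belongs to a different interpreter than the python you are running; python3 -m pip install transformers guarantees the package lands in the interpreter you name.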
Make sure you're in the right directory. Inspect closely and you will see there is a nested transformers directory within the first: transformers/src/transformers. The latter is where you want to be to run the script.
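As a rough illustration of that directory check (assuming you are running from inside a cloned transformers repo; adjust paths to your layout), this sketch prints the working directory and where Python resolves transformers from:

```python
import importlib.util
import os

# The script's directory (or the cwd for an interactive prompt) is prepended
# to sys.path, so a local transformers/ folder there can shadow the
# installed package.
print("cwd:", os.getcwd())

spec = importlib.util.find_spec("transformers")
if spec is None:
    print("transformers is not importable from here")
else:
    # origin is None when a bare directory is picked up as a namespace
    # package instead of the real installed library.
    print("resolved from:", spec.origin or list(spec.submodule_search_locations))
```

If the resolved path points into your clone rather than site-packages, move to the directory the reply suggests (or out of the clone entirely) and rerun the script.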
Seems like a local setup issue on the user's side. Closing.