
The conversion command cannot locate the installed transformers

Open ant3001 opened this issue 2 years ago • 2 comments

Hi, when I run the conversion command for the 7B model to generate the diff between LLaMA and Vicuna, it fails with the error below:

from transformers import AutoTokenizer, AutoModelForCausalLM

ModuleNotFoundError: No module named 'transformers'

But when I check with pip3 show transformers, it shows that transformers is already installed. This may be a very simple question, but is there any suggestion? Thanks, I've been stuck here for days. Only Python 3.10 is installed and no virtual env has been set up.

ant3001 avatar Apr 10 '23 03:04 ant3001

Check which environment you are using. Open a Python prompt in that environment and see if it can import transformers.

Alternatively, use which python to locate the Python interpreter you are running.
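As a quick sanity check along those lines, a short script like the following (a sketch, not part of FastChat itself) prints which interpreter is running and whether transformers is importable from it; if pip3 installed the package into a different interpreter, find_spec will return None here:

```python
import importlib.util
import sys

# The interpreter actually running this script -- compare it with the
# output of `which python` and the "Location" line from `pip3 show transformers`.
print(sys.executable)

# find_spec returns None when the module cannot be imported
# from this interpreter's environment.
spec = importlib.util.find_spec("transformers")
if spec is None:
    print("transformers is NOT importable from this interpreter")
else:
    print("transformers found at:", spec.origin)
```

If the script reports the module as missing, installing with the same interpreter (python3 -m pip install transformers) usually resolves the mismatch.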

sauravm8 avatar Apr 14 '23 08:04 sauravm8

Make sure you're in the right directory. Look closely and you will see there is a nested transformers directory within the first: transformers/src/transformers. The latter is where you want to be when running the script.

dmitchell217 avatar Apr 15 '23 02:04 dmitchell217

This seems to be a local setup issue on the user's side. Closing.

zhisbug avatar May 08 '23 07:05 zhisbug