
KeyError: 'llama' when trying to run PromptCompressor()

Open · radcon00 opened this issue on Jan 8, 2024 · 3 comments

Here is the stack trace. I can't for the life of me figure out what the source of this error is.

```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
Cell In[2], line 4
      1 # testing prompt compression using llmlingua
      2 from llmlingua import PromptCompressor
----> 4 compressor = PromptCompressor(device_map="cpu")
      5 compressed_prompt = compressor.compressed_prompt(prompt= text, instructions="identify the last unanswered question the trascript", question="", tartget_token =350)

File c:\Users\conrad.liburd\Anaconda3\envs\tapas\lib\site-packages\llmlingua\prompt_compressor.py:27, in PromptCompressor.__init__(self, model_name, device_map, model_config, open_api_config)
     20 def __init__(
     21     self,
     22     model_name: str = "NousResearch/Llama-2-7b-hf",
    (...)
     25     open_api_config: dict = {},
     26 ):
---> 27     self.load_model(model_name, device_map, model_config)
     28     self.retrieval_model = None
     29     self.retrieval_model_name = None

File c:\Users\conrad.liburd\Anaconda3\envs\tapas\lib\site-packages\llmlingua\prompt_compressor.py:40, in PromptCompressor.load_model(self, model_name, device_map, model_config)
     38 if "trust_remote_code" not in model_config:
     39     model_config["trust_remote_code"] = trust_remote_code
---> 40 config = AutoConfig.from_pretrained(
     41     model_name, trust_remote_code=trust_remote_code
     42 )
     43 tokenizer = AutoTokenizer.from_pretrained(
     44     model_name, trust_remote_code=trust_remote_code
     45 )
     46 if model_config.get("pad_to_left", True):

File c:\Users\conrad.liburd\Anaconda3\envs\tapas\lib\site-packages\transformers\models\auto\configuration_auto.py:917, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
    915     return config_class.from_pretrained(pretrained_model_name_or_path, **kwargs)
    916 elif "model_type" in config_dict:
--> 917     config_class = CONFIG_MAPPING[config_dict["model_type"]]
    918     return config_class.from_dict(config_dict, **unused_kwargs)
    919 else:
    920     # Fallback: use pattern matching on the string.
    921     # We go from longer names to shorter names to catch roberta before bert (for instance)

File c:\Users\conrad.liburd\Anaconda3\envs\tapas\lib\site-packages\transformers\models\auto\configuration_auto.py:623, in _LazyConfigMapping.__getitem__(self, key)
    621     return self._extra_content[key]
    622 if key not in self._mapping:
--> 623     raise KeyError(key)
    624 value = self._mapping[key]
    625 module_name = model_type_to_module_name(key)

KeyError: 'llama'
```

radcon00 · Jan 08 '24 17:01

Hi @radcon00, thank you for your interest in LLMLingua.

This KeyError usually means the transformers build being imported predates LLaMA support, so 'llama' is not yet registered in CONFIG_MAPPING. Could you update the transformers package and try again?
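For example, a quick sanity check run from the same environment that raises the error (the version floor in the comment is approximate, from memory of the transformers changelog):

```python
import transformers
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

print(transformers.__version__)   # LLaMA support landed around transformers 4.28
print(transformers.__file__)      # which site-packages copy is actually being imported
print("llama" in CONFIG_MAPPING)  # should print True on any recent install
```

If the last line prints False, the interpreter is loading a transformers build that predates LLaMA, regardless of what `pip list` reports elsewhere.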

iofu728 · Jan 09 '24 10:01

Getting the same error. I updated transformers to 4.36.2 and the KeyError persisted.

Error message:

```
formers/models/auto/configuration_auto.py", line 917, in from_pretrained
    config_to_name = {
File "/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 623, in __getitem__
    ("oneformer", "OneFormer"),
KeyError: 'llama'
```

moebius-ansa · Jan 13 '24 21:01

Hi @radcon00 and @moebius-ansa, this doesn't quite make sense: 'llama' has been a registered key in CONFIG_MAPPING for some time. You can see the definition of the 'llama' key-value mapping at https://github.com/huggingface/transformers/blob/main/src/transformers/models/auto/configuration_auto.py#L130.

Could you check the transformers version actually installed in '/lib/python3.10/site-packages/transformers', or update it by running `pip install git+https://github.com/huggingface/transformers.git`?
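One thing worth ruling out is that the notebook kernel is importing a different copy of transformers than the one you upgraded; the `/lib/python3.10/site-packages` path in your traceback may not belong to the environment where pip ran. A minimal check, run from the failing kernel itself:

```python
import sys
import transformers

print(sys.executable)            # the interpreter behind this kernel
print(transformers.__version__)  # should be the 4.36.2 you installed
print(transformers.__file__)     # should live under the same environment as sys.executable
```

(As an aside, once the model loads, the call in the first comment will still fail at the API level: the method is `compress_prompt` and the keyword is `target_token`, not `compressed_prompt` / `tartget_token`.)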

iofu728 · Jan 15 '24 06:01