Mohammed Faheem

8 comments by Mohammed Faheem

Same issue here. I want to load the model "wojtab/llava-7b-v0-4bit-128g" with from_pretrained().

> @TheFaheem Sorry, may I know how to solve this problem?

Check it out here: https://github.com/PanQiWei/AutoGPTQ

This is exactly what's happening for me. In my personal project, this error occurs while using the official tokenizer for the RWKV model through LangChain, which uses rwkv...

Has anyone found a solution for this? @wccccp ....

@balajiChundi, can you elaborate a bit on what you mean?

Same problem here. Help me, please....