Mohammed Faheem
Same issue here. I want to load the model "wojtab/llava-7b-v0-4bit-128g" using from_pretrained().
Got a solution! Check out AutoGPTQ.
> @TheFaheem Sorry, may I know how to solve this problem?

Check it out here => https://github.com/PanQiWei/AutoGPTQ
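For anyone landing here later, this is roughly what the AutoGPTQ route looks like. A minimal sketch, assuming `auto-gptq` and `transformers` are installed and a CUDA device is available; `from_quantized()` stands in for the `from_pretrained()` call that fails on GPTQ checkpoints. The model id is the one from this thread; everything else is illustrative.

```python
def load_quantized(model_id="wojtab/llava-7b-v0-4bit-128g", device="cuda:0"):
    """Load a 4-bit GPTQ checkpoint plus its tokenizer (sketch)."""
    # Imports are kept inside the function so this file can be read or
    # imported without the heavy dependencies installed.
    from auto_gptq import AutoGPTQForCausalLM
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # AutoGPTQ loads quantized weights via from_quantized(), not
    # from_pretrained(); extra kwargs (e.g. use_safetensors) depend on
    # how the repo was saved.
    model = AutoGPTQForCausalLM.from_quantized(model_id, device=device)
    return model, tokenizer
```

Whether this works for a given repo depends on it actually containing GPTQ-format weights; check the model card first.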
This is exactly what is happening for me. I'm working on a personal project, and this error occurs while using the official tokenizer for the RWKV model via LangChain, which uses rwkv...
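For context, the LangChain RWKV setup looks roughly like this. A sketch only, assuming `langchain` and `rwkv` are installed; the model and tokenizer file paths are placeholders, not real files.

```python
def build_rwkv_llm(model_path="path/to/rwkv-model.pth",
                   tokens_path="path/to/20B_tokenizer.json"):
    """Build a LangChain RWKV LLM wrapper (paths are placeholders)."""
    # Import inside the function so the sketch is readable without
    # langchain/rwkv installed.
    from langchain.llms import RWKV

    # strategy uses rwkv's notation, e.g. "cpu fp32" or "cuda fp16".
    return RWKV(model=model_path, tokens_path=tokens_path, strategy="cpu fp32")
```

If the tokenizer error comes from the `tokens_path` file, double-check that it matches the tokenizer the checkpoint was trained with.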
Does anyone have a solution for this? @wccccp
@felixvor
@balajiChundi can you elaborate a bit on what you mean?
Same problem, please help.
Even after upgrading, it shows the same error.
It's not a package problem; it's the Kaggle notebook. Just restart the kernel and run the package update lines first. It should work fine.