Mohammed Faheem

22 comments by Mohammed Faheem

Same issue here. I want to use the model "wojtab/llava-7b-v0-4bit-128g" with from_pretrained().

> @TheFaheem Sorry, may I know how to solve this problem?

Check it out here: https://github.com/PanQiWei/AutoGPTQ
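Roughly something like this (an untested sketch, not the exact code I use; the GPTQ checkpoint is assumed to load as a plain causal LM, and the device/safetensors settings will depend on your setup):

```python
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

model_id = "wojtab/llava-7b-v0-4bit-128g"

# Load the 4-bit GPTQ weights via AutoGPTQ instead of plain from_pretrained().
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",        # adjust for your hardware
    use_safetensors=True,   # assumption: the repo ships safetensors weights
)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to("cuda:0")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```

Note that LLaVA is multimodal, so the vision side may need extra handling beyond this; the sketch only shows the general AutoGPTQ loading pattern.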

This is exactly what is happening for me. I'm working on a personal project, and this error occurs while using the official tokenizer for the RWKV model through LangChain, which uses rwkv...
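For context, this is roughly how I'm wiring it up (paths below are placeholders, not my actual ones, and the LangChain RWKV wrapper's arguments may differ between versions):

```python
from langchain.llms import RWKV

# The tokens_path should point at the official RWKV tokenizer JSON
# (the "20B_tokenizer.json" shipped with the RWKV-4 releases).
llm = RWKV(
    model="path/to/RWKV-4-Raven-7B.pth",       # placeholder checkpoint path
    strategy="cpu fp32",                       # see the rwkv package docs for GPU strategies
    tokens_path="path/to/20B_tokenizer.json",  # placeholder tokenizer path
)

print(llm("Q: What is RWKV?\nA:"))
```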

Does anyone have a solution for this? @wccccp ...

@balajiChundi can you elaborate a bit on what you mean?

Same problem here, please help...

It's not a problem with the package; Kaggle notebooks are just flaky about this. Restart the kernel and run the package update lines first. It should work fine after that.
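Roughly, in the Kaggle notebook (the package names below are just examples, substitute whatever you're updating):

```python
# Cell 1: run the update lines first.
%pip install -U transformers accelerate

# Then restart the kernel (Run ▸ Restart & clear cell outputs) before importing anything,
# so the freshly installed versions are actually picked up instead of the preinstalled ones.
```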