Syed Muhammad Ali Fatmi

3 comments by Syed Muhammad Ali Fatmi

I also got this error, but the following command worked for me. Install the CUDA libraries with `pip install ctransformers[cuda]`, then run your project code again.
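For reference, a minimal sketch of loading a GGML model after installing the CUDA build; the `gpu_layers` value and the prompt are illustrative assumptions, not part of the original comment:

```python
# Minimal sketch: load a GGML model with ctransformers after installing
# the CUDA build (pip install ctransformers[cuda]).
from ctransformers import AutoModelForCausalLM

# gpu_layers is an illustrative value: number of layers to offload to the
# GPU (0 keeps everything on the CPU).
llm = AutoModelForCausalLM.from_pretrained("marella/gpt-2-ggml", gpu_layers=50)

print(llm("AI is going to"))  # illustrative prompt
```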

You can use this code to increase the maximum context length for your LLM: `config = {'max_new_tokens': 256, 'repetition_penalty': 1.1, 'context_length': 1000}` and then `llm = CTransformers(model='marella/gpt-2-ggml', config=config)`. For more information you can check below...
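A runnable sketch of that snippet, assuming the LangChain wrapper for ctransformers (`langchain_community.llms.CTransformers`); the prompt is only illustrative:

```python
# Sketch using the LangChain CTransformers wrapper; import path assumes a
# recent langchain_community install.
from langchain_community.llms import CTransformers

# context_length raises the maximum context window; max_new_tokens and
# repetition_penalty control generation.
config = {'max_new_tokens': 256, 'repetition_penalty': 1.1, 'context_length': 1000}
llm = CTransformers(model='marella/gpt-2-ggml', config=config)

print(llm.invoke("AI is going to"))  # illustrative prompt
```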

I am facing the same error. Did you manage to sort this problem out?