ManniX-ITA

87 comments by ManniX-ITA

Oh yes sure, works perfectly!

@jakobdylanc Can confirm it's working with ollama/llava-phi3. I just had to also run `pip install Pillow`.

@jakobdylanc Yes, it was an error from litellm, which wasn't installed with the upgrade. It's probably an optional dependency, maybe only needed to process images:

```
2024-05-29 16:11:37,798 ERROR: Error while streaming response...
```
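This is consistent with the usual optional-dependency pattern: Pillow is only needed for image handling, so a missing install surfaces at request time rather than at startup. A minimal sketch of that pattern (the name `can_process_images` is mine for illustration, not litellm's API):

```python
# Optional-dependency pattern: defer the Pillow import and only
# fail (or degrade) when image handling is actually requested.
try:
    from PIL import Image  # noqa: F401  (only needed for image inputs)
    HAVE_PILLOW = True
except ImportError:
    HAVE_PILLOW = False


def can_process_images() -> bool:
    """Report whether image inputs can be handled (Pillow installed)."""
    return HAVE_PILLOW
```

If the flag is False, `pip install Pillow` resolves it, which matches what worked above.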

@rick-github There's not much in the server logs as far as I can see; the execution time of the API requests above is the relevant part. `$Modelname` is a bash variable...

Yes, it's the q6_k quant. For the next model I quantize, I'll enable the debug logs to see if there's more insight.
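For context, producing a q6_k quant from an imatrix run in llama.cpp looks roughly like this (a sketch; the file names are placeholders I made up, only the `--imatrix` flag and the `Q6_K` type name are real `llama-quantize` arguments):

```shell
# Quantize a GGUF model to Q6_K using a precomputed importance matrix.
# model-f16.gguf / imatrix.dat / model-Q6_K.gguf are placeholder names.
./llama-quantize --imatrix imatrix.dat \
    model-f16.gguf model-Q6_K.gguf Q6_K
```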

@charsleysa Thanks, I missed it; that happens sometimes after computing the imatrix. But as you can see later, it loads the model with CUDA without any issue. Have to pick...

@dhiltgen I'm suffering the usual NVIDIA Linux driver issues... running llama-imatrix for a while causes the driver to start acting up; I get this "no NVIDIA GPU found" error after...

@dhiltgen I'll close the issue; I'm pretty sure at this point it was the usual NVIDIA Linux driver mess. I also don't see any more complaints on Discord, probably everyone else as...

> Or whatever else it does as well, I was confused since nothing was mentioned in the readme.

You're right, that was meant to be there; maybe I never...

Sorry, I missed the notification. Please check with the beta, as suggested by @milkmaman.