abdinal1

4 comments by abdinal1

I had the same issue, but I was using Colab and changing the files did not affect the runtime. Now I'm running it locally and the error is gone. You...

@bonswouar Thanks for your reply. I did try starling-lm and got an output. The result is, as you described, okay-ish. Do you have any special configuration for the...
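
For context, "configuration" with Ollama usually means generation options. Below is a minimal sketch, not the poster's actual setup, of passing such options to a locally running Ollama server through its REST API; the prompt and option values are illustrative placeholders.

```python
# Sketch: send custom generation options to a local Ollama server.
# Assumes the server is already running on its default port (11434).
import json
import urllib.request

payload = {
    "model": "starling-lm",     # model mentioned in the comment
    "prompt": "Hello, world!",  # placeholder prompt
    "stream": False,            # return one JSON object instead of a stream
    "options": {
        "temperature": 0.7,     # example values, not from the thread
        "num_ctx": 4096,
    },
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```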

The error can easily be reproduced with the Kaggle notebook I released: https://www.kaggle.com/code/aliabdin1/ollama-server/

@Biancamazzi I'm running the Ollama server on Kaggle resources as shown in the notebook: https://www.kaggle.com/code/aliabdin1/ollama-server Everything works fine until my model size increases; I don't get an out-of-memory error, it...
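
For reference, a rough sketch of how an Ollama server can be started from a notebook cell; the linked Kaggle notebook is the authoritative version. This assumes the official install script (curl -fsSL https://ollama.com/install.sh | sh) has already been run in the environment.

```python
# Sketch: launch an Ollama server as a background process from a notebook.
import subprocess
import time

# Start the server; it listens on localhost:11434 by default.
server = subprocess.Popen(
    ["ollama", "serve"],
    stdout=subprocess.DEVNULL,
    stderr=subprocess.DEVNULL,
)
time.sleep(5)  # give the server a moment to come up

# Pull a model; larger models need more RAM/VRAM, which is where the
# size-related failures described above would show up.
subprocess.run(["ollama", "pull", "starling-lm"], check=True)
```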