Antonio Di Monaco


I confirm that it's not working on Apple M1, even with version 0.12.10. I'm using it right now, same model, same inputs, and it's super slow on the latest version, unfortunately.
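To put a number on "super slow", here is a minimal sketch that times one non-streaming request against the local Ollama API so the speed can be compared across versions; the model name and prompt are just examples, and the `eval_count`/`eval_duration` fields are what recent releases report in the response.

```ts
// Minimal sketch: time a single non-streaming generation against the
// local Ollama API so decode speed can be compared between versions.
const MODEL = "jobautomation/OpenEuroLLM-Italian:latest"; // example model

async function benchmark(prompt: string): Promise<void> {
  const started = Date.now();
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: MODEL, prompt, stream: false }),
  });
  const data: any = await res.json();
  const wallSeconds = (Date.now() - started) / 1000;
  // eval_count / eval_duration (nanoseconds) gives decode speed in tokens/s,
  // assuming the server reports these fields as recent releases do.
  const tokensPerSec = data.eval_count / (data.eval_duration / 1e9);
  console.log(`wall time ${wallSeconds.toFixed(1)}s, ~${tokensPerSec.toFixed(1)} tok/s`);
}

benchmark("Write one short test sentence.").catch(console.error);
```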

[server.log](https://github.com/user-attachments/files/23418640/server.log) My server.log, in case it can help.

@rick-github there has never been any need for this before. Has anything changed? The only thing I've ever done is upgrade Ollama. The model has never changed, it...

Could it be that, before, Ollama wasn't taking those parameters into account, at least in this context?

I'm calling the model using LangChain JS, but the issue occurs even with the plain Ollama UI. I'll test the parameter you mentioned. Thanks!
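For context, this is roughly how the call looks from LangChain JS; the package and option names (`@langchain/ollama`, `ChatOllama`, `numCtx`) are a sketch rather than an exact copy of my setup, and the `numCtx` override only stands in for the parameter being tested.

```ts
// Rough sketch of the LangChain JS call path; option values are examples.
import { ChatOllama } from "@langchain/ollama";

const chat = new ChatOllama({
  baseUrl: "http://localhost:11434", // default local Ollama endpoint
  model: "jobautomation/OpenEuroLLM-Italian:latest",
  numCtx: 4096, // hypothetical override for the parameter in question
});

async function main(): Promise<void> {
  const reply = await chat.invoke("Reply with a single test sentence.");
  console.log(reply.content);
}

main().catch(console.error);
```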

As an additional test, I went back to 0.12.5, as @rwellinger reported, and now it's fast again! Even if, on the command line, ollama show --modelfile jobautomation/OpenEuroLLM-Italian:latest...

[server.log](https://github.com/user-attachments/files/23426018/server.log) That's the server.log of the working version. Thanks!

0.12.8 also works fine. Given that these are the changes, maybe this can help spot it: https://github.com/ollama/ollama/compare/v0.12.8...v0.12.9

Ok, sorry, I was misled by the issue title. I confirm that v0.12.9 still works; it's v0.12.10 that is broken. So it must be one of these: https://github.com/ollama/ollama/compare/v0.12.9...v0.12.10

Could it be https://github.com/ollama/ollama/commit/6aa72830763cf694da998f5305de89701c75cea0? @dhiltgen