benchmarks
Ollama
@Anindyadeep can you check if they support llama2/mistral? Otherwise let's close the issue
Yes, we need to put this at medium priority, since it is an engine
Isn't Ollama practically a wrapper around llama.cpp, though?
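For reference, one way to check which model families a local Ollama install has pulled is its model-listing endpoint (`GET /api/tags`). A minimal sketch, using a hard-coded sample response of the endpoint's shape rather than a live daemon (the sample model names are illustrative, not real output):

```python
import json

# Hypothetical sample of the JSON returned by GET http://localhost:11434/api/tags
# (shape assumed from Ollama's model-listing endpoint; names are made up)
sample = json.loads('{"models": [{"name": "llama2:7b"}, {"name": "mistral:latest"}]}')

def supports(models: list[dict], family: str) -> bool:
    """Return True if any pulled model belongs to the given family (name before the tag)."""
    return any(m["name"].split(":")[0] == family for m in models)

print(supports(sample["models"], "llama2"))   # True for the sample above
print(supports(sample["models"], "mistral"))  # True for the sample above
```

Against a running daemon, the same check would fetch `http://localhost:11434/api/tags` instead of the hard-coded sample.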