Niq Dudfield
> I'm not entirely sure I understand your comments, but I'll try. Sorry for the late reply, sent while sleepy and disappointed. I recall friends sending me quite crude simulations before...
@weshinsley Thanks for the response

> weather is the word you want I think, not climate

Yes and no :) The definition of climate:

> the weather conditions prevailing in...
@robscovell-ts

> correlation and causation

Right

> I recommend you read Nassim Nicholas Taleb. He is controversial in the modelling community

Cheers! I'm a programmer but not a modeler...
I note this paper was updated recently: https://papers.ssrn.com/sol3/Papers.cfm?abstract_id=3551767
Related: https://github.com/ollama/ollama/issues/2335#issuecomment-1933676859
Do you want to do anything with this?
I wonder if one could make a little Node.js prototype server for this that simply wraps/proxies to multiple llama.cpp instances running in mmap mode. Would be fun to see it purring...
Looks like someone made an Ollama proxy server: https://github.com/ParisNeo/ollama_proxy_server I've never tinkered much with the Ollama memory settings (so job well done, Ollama team). Can it use mmap?
Seems it can use mmap. At least for `/api/generate` there is a `use_mmap` option. You can start multiple instances of Ollama by setting the OLLAMA_HOST environment variable, e.g. `OLLAMA_HOST=0.0.0.0:11435 ollama...`
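Putting the two pieces together might look like this (the model name is illustrative, and I'm assuming `use_mmap` sits under `options` in the request body):

```shell
# First instance on the default port (11434)
ollama serve &

# Second instance on another port via OLLAMA_HOST
OLLAMA_HOST=0.0.0.0:11435 ollama serve &

# Ask the second instance to generate with mmap enabled
curl http://localhost:11435/api/generate -d '{
  "model": "llama2",
  "prompt": "hello",
  "options": { "use_mmap": true }
}'
```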
I don't know if there's "anything on the table" regardless.