Luca Fulchir
`pub fn get_user_groups` seems to have the same problem:

```
let res = unsafe { libc::getgrouplist(name.as_ptr(), gid, buff.as_mut_ptr(), &mut count) };
```

since `count` is a value-result and is not...
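For illustration, here's a minimal sketch (not the actual code from this crate; the buffer-sizing details are my assumption) of how the value-result `ngroups` parameter is usually handled on Linux: initialise `count` to the buffer capacity before the call, and on a `-1` return grow the buffer to the size reported back through `count` and retry.

```
use std::ffi::CString;

/// Sketch only: shows the usual retry loop around getgrouplist's
/// value-result `ngroups` parameter, not this crate's implementation.
fn get_user_groups(name: &str, gid: libc::gid_t) -> Option<Vec<libc::gid_t>> {
    let name = CString::new(name).ok()?;
    // `count` must hold the buffer capacity *before* the call.
    let mut count: libc::c_int = 8;
    let mut buff: Vec<libc::gid_t> = vec![0; count as usize];
    loop {
        let res = unsafe {
            libc::getgrouplist(name.as_ptr(), gid, buff.as_mut_ptr(), &mut count)
        };
        if res != -1 {
            // On success `count` holds the number of groups written.
            buff.truncate(count as usize);
            return Some(buff);
        }
        // -1 means the buffer was too small; `count` now reports the
        // required size, so resize and call again.
        buff.resize(count as usize, 0);
    }
}
```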
related to ticket #44
Updated to 0.27.1 and tried different models thanks to more RAM; the local embedding is still marked as 'unreachable', with the same errors.
Workaround: use `http` for embeddings, not `local`. I literally copied the `llama-server` command line and ran llama manually. Connecting this way works:

```
[model.embedding.http]
kind = "llama.cpp/embedding"
model_name = "Nomic-Embed-Text"
api_endpoint...
```
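For reference, a complete section of that shape might look like the sketch below; the `api_endpoint` value is my assumption (a manually started `llama-server` listens on port 8080 by default), not what the truncated snippet above contained.

```
# Sketch only: the endpoint value is an assumption, pointing at a
# manually started llama-server (default port 8080); adjust as needed.
[model.embedding.http]
kind = "llama.cpp/embedding"
model_name = "Nomic-Embed-Text"
api_endpoint = "http://localhost:8080"
```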
Tabby is configured to use the NixOS llama.cpp, built with Vulkan. That currently seems to be release `b4154`. Now I notice that when I run llama manually instead, it uses release...