CJ Pais
I am working on a few projects right now, but if I get a chance I will try to get support in (assuming it doesn't already work). I would also...
FWIW moondream support was merged in #6899; I haven't had a chance to look at or try internvl yet.
There are many ways to do this. You could use utilities like `screen` or `tmux`, or run the command as normal and append an `&` to the end: `./llamafile...
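As a rough sketch only (the `[your usual flags]` bits are placeholders, and `nohup` is just one more common option alongside `screen`/`tmux`):

```sh
# 1. Plain backgrounding in the current shell (stops if the terminal closes):
./llamafile [your usual flags] &

# 2. Detach from the terminal entirely and keep a log:
nohup ./llamafile [your usual flags] > llamafile.log 2>&1 &

# 3. Keep it inside a named tmux session you can re-attach to later:
tmux new-session -d -s llamafile './llamafile [your usual flags]'
tmux attach -t llamafile
```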
As far as I can tell the code has most of the PRs in already (assuming it lives in `llm/ext_server/server.cpp`). It looks like the projector files used are an older...
@jart I am making progress on getting a version of whisper.cpp built with llamafile, specifically the server example. The executable itself is working and seems to be compiling properly for...
Thanks @jart, that was all I needed. My C skills are a bit rusty, so it was great to know I wasn't missing anything obvious; instead I was just forgetting...
It can also be done, and is probably fairly easy to do, in the whisperfile repo. I needed `server` for a project I am doing, so that was my primary focus. If...
Hey, I am quite busy with a few projects; it's on my list but just not very high priority at the moment. It's really only something I can do in...
I'm sorry, I don't know when I can do this; I have a huge backlog of projects I'm currently working on! I am very curious to try it, but unfortunately...
Yes, it can. On an RTX 4080 I am able to get 4 parallel streams without issue. 4 streams is the default value for `max_clients` in the `TranscriptionServer` class. You can...
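A quick way to exercise the parallel streams from the shell, as a sketch only: this assumes a whisper.cpp-style `/inference` endpoint on port 8080 and a local `sample.wav`; adjust the URL and form fields to however the server is actually started.

```sh
# Fire 4 transcription requests at once so all 4 streams are in use.
for i in 1 2 3 4; do
  curl -s http://localhost:8080/inference -F file="@sample.wav" > "out_$i.txt" &
done
wait  # block until all four background requests have returned
```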