Jake Luciani
Hi @ved-asole can you give some more details like what you did to run this? did you just checkout the code and `run-cli.sh`?
Thanks! I recently added support for multiple JVMs, which broke this. I'll fix it. In the meantime, use the code from the latest release tag (`git checkout v0.2.1`) and it...
You need to build the native extensions, which requires `make` and `gcc`. I realize this is all a PITA. I'm going to add release artifacts for all the platforms when I...
The easiest way to "just run a model" is with the langchain4j integration. Try this: https://github.com/langchain4j/langchain4j-examples/tree/main/jlama-examples
Yes, it uses the Maven artifacts, which are pre-built.
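If it helps, here is a rough sketch of pulling those pre-built artifacts into a Maven project. The coordinates, version, and classifier below are assumptions; verify them against the Jlama README before using:

```xml
<!-- Sketch only: groupId/artifactId/version/classifier are assumed,
     check the Jlama README for the current coordinates -->
<dependency>
    <groupId>com.github.tjake</groupId>
    <artifactId>jlama-core</artifactId>
    <version>0.2.1</version>
</dependency>
<dependency>
    <!-- Pre-built native SIMD extensions; pick the classifier
         matching your OS/CPU (e.g. linux-x86_64, osx-aarch_64) -->
    <groupId>com.github.tjake</groupId>
    <artifactId>jlama-native</artifactId>
    <version>0.2.1</version>
    <classifier>linux-x86_64</classifier>
</dependency>
```

With the native artifact resolved for your platform, you skip the local `make`/`gcc` build entirely.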
Which model are you loading and what kind of cpu are you loading it on?
Looking at the line, it seems to be a non-ARM AVX machine, which I can support, but it currently needs a code change. I can fix that.
It works with the distilled models, not the actual 70B DeepSeek R1 model. But I'm working on adding that.
Hey, thanks! Yeah, I need to add a `DEVELOPER_GUIDE.md` file that explains this. The `Dockerfile` shows the steps involved, mainly having `~/.m2/toolchains.xml` set up for multi-release builds.
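Until that guide exists, here's a minimal sketch of what a `~/.m2/toolchains.xml` for a multi-release build can look like. The JDK versions and paths are assumptions; match them to whatever the `Dockerfile` actually installs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch: one <toolchain> entry per JDK the multi-release build needs.
     Versions and jdkHome paths below are assumed examples. -->
<toolchains>
  <toolchain>
    <type>jdk</type>
    <provides>
      <version>21</version>
    </provides>
    <configuration>
      <jdkHome>/usr/lib/jvm/java-21-openjdk</jdkHome>
    </configuration>
  </toolchain>
  <toolchain>
    <type>jdk</type>
    <provides>
      <version>22</version>
    </provides>
    <configuration>
      <jdkHome>/usr/lib/jvm/java-22-openjdk</jdkHome>
    </configuration>
  </toolchain>
</toolchains>
```

Maven's toolchains plugin then picks the right JDK per release target instead of relying on whatever `JAVA_HOME` points at.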
Hi, it is pretty cross-platform as-is: Windows/Linux/Mac. I'm guessing mobile could work for Android. Not sure how iOS would work?