llama.cpp
[Enhancement] Officially supported/provided python bindings
Would the project accept a PR providing directly supported Python bindings?
I know the README mentions llama-cpp-python. And there are these and perhaps others:
- pip install llama-cpp-python
- pip install llamacpp
- pip install pyllamacpp
- pip install llamacpypy
Instead, perhaps it would be better to expose at least the low-level interface directly in this repo as Python bindings.
Perhaps rwkv.cpp can be used as an example. It does the following:
- allow building as a shared library
- create Python bindings that just expose the functions in the shared library as-is (see the sketch after this list)
- (optional) create a higher-level model that builds on the basic bindings
- examples in Python, rather than bash scripts
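To illustrate the second point, here is a minimal ctypes sketch of what such as-is bindings could look like, assuming llama.cpp is built as a shared library (e.g. `libllama.so`). Apart from `llama_print_system_info`, which `llama.h` does export, the library path and wrapper name here are illustrative.

```python
# Minimal sketch of low-level bindings via ctypes, assuming llama.cpp has been
# built as a shared library. The library path is an assumption; adjust for
# .dll/.dylib on other platforms.
import ctypes

_lib = ctypes.CDLL("./libllama.so")

# Declare return/argument types for each exposed function, mirroring llama.h.
# llama_print_system_info() takes no arguments and returns a C string.
_lib.llama_print_system_info.restype = ctypes.c_char_p
_lib.llama_print_system_info.argtypes = []

def print_system_info() -> str:
    """Thin Python wrapper that just forwards to the C function."""
    return _lib.llama_print_system_info().decode("utf-8")

if __name__ == "__main__":
    print(print_system_info())
```

Other functions from `llama.h` would be declared the same way, keeping the binding a one-to-one mirror of the C API.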
+1
+1
I would second having an official one. None of the above really works out of the box.
I looked at this a little more. I was hoping that there was a good tool for automatically generating Cython or pybind11 bindings, as maintenance of hand-crafted bindings may be problematic.
There are many tools ... but I did not find anything that would work with llama.cpp yet. The C++ tools like RosettaCommons/binder do not seem to fit, as the llama interface is really a C interface.
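Since the interface is plain C, a full binding generator may not even be necessary; something like cffi in ABI mode can consume C declarations directly. A rough sketch, assuming a `libllama.so` shared library and declaring only `llama_print_system_info` (in a real binding the `cdef` contents could be taken from a preprocessed `llama.h`):

```python
# Sketch of an alternative to hand-written ctypes: cffi in ABI mode.
# The declarations and library path below are illustrative, not a complete binding.
from cffi import FFI

ffi = FFI()
# Declarations are plain C, so they can be pasted or generated from llama.h.
ffi.cdef("""
    const char * llama_print_system_info(void);
""")

lib = ffi.dlopen("./libllama.so")  # assumed shared-library name/path
print(ffi.string(lib.llama_print_system_info()).decode("utf-8"))
```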
Actually I was wondering if @saharNooby could comment with any lessons from rwkv.cpp.
Or if others had any good experience with tools to generate python bindings for an existing library.
Duplicate of #82, closing this as there's more discussion over there.
@dmahurin Thanks for pinging me!
I don't use any codegen tools in rwkv.cpp; the wrapper around the native library is manually written using ctypes. This approach works OK because there are not many functions in rwkv.cpp. But even with a large API, the interface would not change frequently anyway, so a manual approach can work too.
One lesson to share: if you are distributing dll/so files, don't make breaking changes in the C API, or else you will be spammed with "oops, I've updated the repo and nothing works!", because users will not rebuild/redownload the libraries.
Not sure if this is applicable to llama.cpp though; it looks like you do not distribute pre-compiled libraries.
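One way to soften that failure mode, sketched below under the assumption of a ctypes-based wrapper: check at load time that the shared library exports the symbols the wrapper expects, and fail with a clear message instead of a confusing crash. The symbol list here is illustrative.

```python
# Sketch: fail fast if the loaded shared library and the Python wrapper have
# drifted apart (e.g. the user updated one without rebuilding the other).
import ctypes

EXPECTED_SYMBOLS = ["llama_print_system_info"]  # extend with the wrapped API

def load_library(path: str) -> ctypes.CDLL:
    lib = ctypes.CDLL(path)
    # ctypes raises AttributeError for missing symbols, so hasattr() works here.
    missing = [name for name in EXPECTED_SYMBOLS if not hasattr(lib, name)]
    if missing:
        raise RuntimeError(
            f"{path} is missing {missing}; rebuild or redownload the library "
            "to match this version of the Python bindings."
        )
    return lib
```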
@sw Ok, we can move the discussion to issue #82.
Though the issue raised here is not currently part of that issue's description (which is empty) or its discussion. The discussion in #82 seems to be mainly about Python bindings external to llama.cpp.
The issue described here was specifically about having a directly supported Python binding in the project (and also replacing the bash scripts with Python).
Python bindings, like rwkv.cpp and bert.cpp have.
I think it is the right approach, and it would allow llama.cpp to be directly integrated into the abundance of Python AI projects.