Add support for external calls from iOS, Android, macOS and Flutter

Open paulocoutinhox opened this issue 1 year ago • 3 comments

Hi,

I'm making a Flutter app that calls the main function from llama.cpp, but I'm getting crashes. Could you add a CMake target that builds shared libraries with a public function that sends the data to a socket?

My Flutter app is working here: https://github.com/paulocoutinhox/llama-flutter

My fork with the required patches is here (though the patches are specific to my use case): https://github.com/paulocoutinhox/llama.cpp/pull/1/files

If you create a CMake target that doesn't require patches and exposes C/C++ interfaces for executing a prompt, I can use it in my Flutter project together with my more robust project, XPLPC (https://github.com/xplpc/xplpc).

Thanks for the help.

paulocoutinhox avatar Apr 16 '23 01:04 paulocoutinhox

You can already compile a library to call from other languages with just a C ABI. main.cpp has a lot of stuff in there that you don't really want if you're integrating it into another program, like the terminal handling, reading from stdin, etc.

If you configure CMake with -DBUILD_SHARED_LIBS=ON then you get a .so/.dll/.dylib. This can be linked into your program. I know I have done this successfully from C#, for example. Another example is the Python wrapper: llama-cpp-python.
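Configuring and building the shared library can be sketched like this (assuming a checkout of the llama.cpp repository; the output directory layout may differ by version and generator):

```shell
# From the root of a llama.cpp checkout: configure with shared libraries
# enabled, then build. The resulting libllama.so/.dylib/.dll can be linked
# from C#, Python, Dart FFI, etc.
cmake -B build -DBUILD_SHARED_LIBS=ON
cmake --build build --config Release
```

For iOS and Android the same configure step works with the usual CMake toolchain arguments (e.g. the Android NDK toolchain file), since the library itself has no terminal-specific code.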

The only difference is that you have to use the lower-level functions to generate text: tokenize, eval, sample, etc., and manage the context so it doesn't fill up.
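A minimal sketch of that generation loop, using the C API roughly as it exists in llama.h at the time of this thread (function names such as `llama_init_from_file` and `llama_sample_top_p_top_k` have changed across versions, and the model path here is a placeholder, so check your own copy of llama.h before relying on any signature):

```cpp
// Sketch: drive llama.cpp through its C API instead of main.cpp.
// Assumes the era's llama.h; verify signatures against your version.
#include <cstdio>
#include <string>
#include <vector>
#include "llama.h"

int main() {
    llama_context_params params = llama_context_default_params();
    llama_context * ctx = llama_init_from_file("models/7B/ggml-model-q4_0.bin", params);
    if (!ctx) return 1;

    // 1. tokenize the prompt
    std::string prompt = "Hello, world";
    std::vector<llama_token> tokens(prompt.size() + 8);
    int n = llama_tokenize(ctx, prompt.c_str(), tokens.data(),
                           (int) tokens.size(), /*add_bos=*/true);
    tokens.resize(n);

    // 2. evaluate the prompt, then sample new tokens one at a time
    int n_past = 0;
    llama_eval(ctx, tokens.data(), (int) tokens.size(), n_past, /*n_threads=*/4);
    n_past += (int) tokens.size();

    for (int i = 0; i < 64; ++i) {
        llama_token tok = llama_sample_top_p_top_k(
            ctx, tokens.data(), (int) tokens.size(),
            /*top_k=*/40, /*top_p=*/0.95f, /*temp=*/0.8f, /*repeat_penalty=*/1.1f);
        if (tok == llama_token_eos()) break;

        // 3. detokenize, emit, and feed the token back in
        printf("%s", llama_token_to_str(ctx, tok));
        tokens.push_back(tok);
        llama_eval(ctx, &tok, 1, n_past, /*n_threads=*/4);
        ++n_past;  // caller is responsible for keeping n_past < llama_n_ctx(ctx)
    }

    llama_free(ctx);
    return 0;
}
```

The loop is the part main.cpp does for you; a wrapper exposing just this loop behind one exported C function is what a Flutter/Dart FFI binding would call.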

SlyEcho avatar Apr 16 '23 22:04 SlyEcho

Hi,

Yes, I've seen those projects. The problem with that approach is that I'd need to create or reimplement most things in Dart.

I want to use it like the executable from the terminal, where we can pass some parameters and it just works, like the generated executable.

It would be nice to have a way to call it like this, because all the implementation would stay on the C++ side.

paulocoutinhox avatar Apr 17 '23 03:04 paulocoutinhox

Hi,

Any update on this?

Thanks.

paulocoutinhox avatar Apr 19 '23 16:04 paulocoutinhox

Yes, we need iOS and Android support too!

realcarlos avatar May 05 '23 06:05 realcarlos

In the llama-jni fork, we rewrote main.cpp to provide a JNI interface for Android applications, essentially capturing every text output of the compiled main.cpp.

Building on this, we also implemented llama-pinyinIME, an input method editor (IME) with LLM capabilities.

Hope it helps.

edwardoll avatar Jun 11 '23 14:06 edwardoll

> Yes, we need iOS and Android support too!

Soon I will move the core module to a separate repository. For now you can look here; maybe you will find something useful for iOS.

guinmoon avatar Jul 04 '23 13:07 guinmoon

Probably the easiest option right now is the server example. It currently runs on TCP sockets, but Unix sockets or named pipes may be possible.
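Using the server example from another process can be sketched like this (the model path is a placeholder, and the endpoint and JSON field names may differ between versions of the server example):

```shell
# Start the bundled server example on localhost (model path is an assumption).
./server -m models/7B/ggml-model-q4_0.bin --host 127.0.0.1 --port 8080 &

# Query it over HTTP from any client, including a Flutter app via dart:io
# or the http package. Field names may vary by server version.
curl -s http://127.0.0.1:8080/completion \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "Building a website can be done in 10 steps:", "n_predict": 32}'
```

This keeps all of the inference logic on the C++ side; the mobile app only needs an HTTP (or, if added later, Unix-socket) client.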

SlyEcho avatar Jul 04 '23 14:07 SlyEcho

Thanks for all the replies; my objective has been achieved.

paulocoutinhox avatar Oct 22 '23 05:10 paulocoutinhox

Note that the server currently does not support authentication, so other apps on the same device may be able to access it.

shibe2 avatar Oct 22 '23 06:10 shibe2