[ANDROID] Failed to lookup symbol 'llama_sampler_chain_default_params'
I am running a Flutter app using the managed-isolate approach, but I receive this error:
```
E/flutter (24616): [ERROR:flutter/runtime/dart_isolate.cc(1321)] Unhandled exception:
E/flutter (24616): LlamaException: Failed to initialize Llama (Invalid argument(s): Failed to lookup symbol 'llama_sampler_chain_default_params': undefined symbol: llama_sampler_chain_default_params)
E/flutter (24616): #0 new Llama (package:llama_cpp_dart/src/llama.dart:83:7)
E/flutter (24616): #1 LlamaChild.onData (package:llama_cpp_dart/src/isolate_child.dart:24:17)
E/flutter (24616): #2 _RootZone.runUnaryGuarded (dart:async/zone.dart:1609:10)
E/flutter (24616): #3 CastStreamSubscription._onData (dart:_internal/async_cast.dart:85:11)
E/flutter (24616): #4 _RootZone.runUnaryGuarded (dart:async/zone.dart:1609:10)
E/flutter (24616): #5 _BufferingStreamSubscription._sendData (dart:async/stream_impl.dart:366:11)
E/flutter (24616): #6 _BufferingStreamSubscription._add (dart:async/stream_impl.dart:297:7)
E/flutter (24616): #7 _SyncBroadcastStreamController._sendData (dart:async/broadcast_stream_controller.dart:370:25)
E/flutter (24616): #8 _BroadcastStreamController.add (dart:async/broadcast_stream_controller.dart:244:5)
E/flutter (24616): #9 _AsBroadcastStreamController.add (dart:async/broadcast_stream_controller.dart:467:11)
E/flutter (24616): #10 _RootZone.runUnaryGuarded (dart:async/zone.dart:1609:10)
E/flutter (24616): #11 _BufferingStreamSubscription._sendData (dart:async/stream_impl.dart:366:11)
E/flutter (24616): #12 _BufferingStreamSubscription._add (dart:async/stream_impl.dart:297:7)
E/flutter (24616): #13 _SyncStreamControllerDispatch._sendData (dart:async/stream_controller.dart:777:19)
E/flutter (24616): #14 _StreamController._add (dart:async/stream_controller.dart:651:7)
E/flutter (24616): #15 _StreamController.add (dart:async/stream_controller.dart:606:5)
E/flutter (24616): #16 _RawReceivePort._handleMessage (dart:isolate-patch/isolate_patch.dart:184:12)
```
I have tried a cold run with a model that the maintainer suggested in another issue (#44): https://huggingface.co/TheBloke/Tinyllama-2-1b-miniguanaco-GGUF/blob/main/tinyllama-2-1b-miniguanaco.Q3_K_L.gguf. Still no luck; I am a bit out of ideas, so I'd appreciate any help :pray:!
Snippet
```dart
Future<String> _copyModelFromAssets() async {
  // Application documents directory to copy the GGUF model into.
  final directory = await getApplicationDocumentsDirectory();
  // Destination path for the GGUF file.
  final modelPath = '${directory.path}/some_model.gguf';
  final modelFile = File(modelPath);
  if (!await modelFile.exists()) {
    final byteData = await rootBundle
        .load('assets/models/tinyllama-2-1b-miniguanaco.Q3_K_L.gguf');
    await modelFile.writeAsBytes(byteData.buffer
        .asUint8List(byteData.offsetInBytes, byteData.lengthInBytes));
    print("Model copied to documents directory.");
  } else {
    print("modelFile already exists!!!");
  }
  return modelPath;
}
```
```dart
Future<String> generateExplanation(String sentence) async {
  String modelPath = await _copyModelFromAssets();
  Completer<String> completer = Completer<String>();
  String prompt = 'Explain the following sentence: $sentence';
  final LlamaLoad loadCommand = LlamaLoad(
    path: modelPath, // use copied local model file
    modelParams: ModelParams(),
    contextParams: ContextParams(),
    samplingParams: SamplerParams(),
    format: ChatMLFormat(),
  );
  final llamaParent = LlamaParent(loadCommand);
  await llamaParent.init();
  llamaParent.stream.listen((response) {
    print(response);
    // Note: if the stream emits token-sized chunks, this completes on the
    // first chunk, so only the first piece of the response is returned.
    if (!completer.isCompleted) {
      completer.complete(response);
    }
  });
  completer.future.then((_) {
    llamaParent.dispose();
  });
  llamaParent.sendPrompt(prompt);
  return completer.future;
}
```
Upon calling `generateExplanation`, the print statement "Model copied to documents directory." is echoed to the screen, but I get the error above right afterwards (no response is printed either).
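(As a side check, probably not the root cause here but useful to rule out a corrupt asset copy: every GGUF file begins with the 4-byte ASCII magic `GGUF`. The path below is an example; on Android the app documents directory lives under the app's private data dir and can be reached with `adb shell run-as`.)

```shell
# Verify the copied model file starts with the GGUF magic bytes.
# (example path; substitute the real documents-directory path on device)
head -c 4 /path/to/some_model.gguf
# prints: GGUF   (for a valid GGUF file)
```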
llama_cpp_version
I am using the latest commit:
```yaml
dependencies:
  llama_cpp_dart:
    path: ../llama_cpp_dart
```
cloned to the parent directory, with the submodules initialized.
The llama.cpp commit pinned by the current git submodule does not contain `llama_sampler_chain_default_params` at all. The latest llama.cpp commit does have it, but building against the latest commit fails with missing build files. Which commit of llama.cpp should we use?
In my llama.cpp, the current commit hash is 4b9afbbe9037f8a2d659097c0c7d9fce32c6494c (4b9afbbe).
Thanks for confirming! I have just tried that commit and I get the same `Failed to lookup symbol 'llama_sampler_chain_default_params'` traceback.
I have no experience with real C++ projects, so my assumptions might be very wrong. I assume that `flutter run` compiles llama.cpp during the build process of llama_cpp_dart.
I am using it in a flutter project pointing to a local path of llama_cpp_dart in the parent directory (as stated above). Then,
```shell
cd ..
git clone https://github.com/netdur/llama_cpp_dart/
cd llama_cpp_dart/src
# instead of using the submodule, clone llama.cpp
rm -r llama.cpp
git clone https://github.com/ggerganov/llama.cpp
# and now check out the commit and update the kompute submodule
cd llama.cpp
git checkout 4b9afbbe
git submodule update --init --recursive
# then run flutter run
cd ../../flutter-project
flutter clean && flutter run
```
Is this okay, or should I be compiling llama.cpp manually? Thank you for your help.
Oh I see, it seems like the library is not built at all! I haven't worked on the Android side yet, so for now you'll need to:
- Build llama.cpp manually first
- Place the built binary in the jni folder
- Make sure your SDK points to the correct architecture (arm64-v8a, armeabi-v7a, etc.)
I'll prioritize Android support for the next release
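A rough sketch of what that manual build could look like, assuming the Android NDK is installed and that the prebuilt library belongs in a `jniLibs` folder (the exact destination path, the `ANDROID_NDK` variable, and the output location of the `.so` are all assumptions, not confirmed by the maintainer):

```shell
# Cross-compile llama.cpp as a shared library for arm64-v8a Android,
# using the NDK's CMake toolchain file.
# ANDROID_NDK is assumed to point at an installed NDK.
cd llama.cpp
cmake -B build-android \
  -DCMAKE_TOOLCHAIN_FILE="$ANDROID_NDK/build/cmake/android.toolchain.cmake" \
  -DANDROID_ABI=arm64-v8a \
  -DANDROID_PLATFORM=android-24 \
  -DBUILD_SHARED_LIBS=ON \
  -DCMAKE_BUILD_TYPE=Release
cmake --build build-android --config Release -j

# Copy the resulting library where the Android build can pick it up.
# The built libllama.so may land in build-android/src/ or build-android/bin/
# depending on the llama.cpp revision; the destination below is a guess
# based on the "jni folder" hint above.
cp build-android/src/libllama.so \
  ../flutter-project/android/app/src/main/jniLibs/arm64-v8a/
```

Repeat with `-DANDROID_ABI=armeabi-v7a` (into `jniLibs/armeabi-v7a/`) if 32-bit devices need to be supported.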
Okay, okay, thanks! If I manage to get it to compile for Android, I'll report back here (although I ran into a few issues trying it before, so I will probably fail).