Fix "Field 'context' has not been initialized."
The real trigger is that the binary dependency cannot be found, but because an uninitialized variable is accessed during cleanup, the actual error message is never shown.
Fixes #51
sorry I just noticed this, I will review
@rekire I'm not super convinced about using the `!` here. I made a branch, PR53, with a different approach - mind giving it a try?
https://github.com/netdur/llama_cpp_dart/tree/PR53
@rekire thank you for helping me understand the issue
I'll check it when I find some free time
I checked out the branch and executed my "smoke test" from #57; the output is still:
```
Using libraryPath: null found: false
Current path: /Users/rekire/dev/llama_cpp_dart_fork
package:llama_cpp_dart/src/llama.dart        Llama.context
package:llama_cpp_dart/src/llama.dart 363:9  Llama.dispose
package:llama_cpp_dart/src/llama.dart 83:7   new Llama
test/llama_cpp_dart_test.dart 13:19          main
LateInitializationError: Field 'context' has not been initialized.
```
So your change does not fix the issue. I understand that you don't like the bang operator, but I personally prefer the nullable approach, and it should crash hard when the field is null. Otherwise, when the library is not found, it crashes anyway, as we see here with the LateInitializationError.
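To make concrete what I mean by "the nullable way": a minimal sketch, where `NativeContext` is a placeholder standing in for the real FFI context type (none of these names are from the actual llama.dart code):

```dart
// Sketch only: `NativeContext` stands in for the real FFI context type.
class NativeContext {}

class Llama {
  // Nullable instead of `late`, so a failed initialization is observable.
  NativeContext? _context;

  // Crashes hard with a clear error instead of a LateInitializationError.
  NativeContext get context =>
      _context ?? (throw StateError('context was never initialized'));

  void dispose() {
    // Skip native cleanup when initialization never happened.
    if (_context == null) return;
    // ...free the native context here...
    _context = null;
  }
}
```

The point is that `dispose()` can check `_context != null` cheaply, whereas a `late` field cannot be inspected without triggering the error.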
I merged your main (aka upstream). Now I get a different error:
```
dart:ffi                                            DynamicLibrary.lookup
package:llama_cpp_dart/src/llama_cpp.dart 10856:55  llama_cpp._llama_backend_freePtr
package:llama_cpp_dart/src/llama_cpp.dart           llama_cpp._llama_backend_freePtr
package:llama_cpp_dart/src/llama_cpp.dart 10858:7   llama_cpp._llama_backend_free
package:llama_cpp_dart/src/llama_cpp.dart           llama_cpp._llama_backend_free
package:llama_cpp_dart/src/llama_cpp.dart 10852:12  llama_cpp.llama_backend_free
package:llama_cpp_dart/src/llama.dart 367:11        Llama.dispose
package:llama_cpp_dart/src/llama.dart 83:7          new Llama
test/llama_cpp_dart_test.dart 13:19                 main
Invalid argument(s): Failed to lookup symbol 'llama_backend_free': dlsym(RTLD_DEFAULT, llama_backend_free): symbol not found
```
I am not sure what the most elegant way to catch this is. It happens during cleanup. A fix might look like this:
```dart
if (_status != LlamaStatus.error) {
  lib.llama_backend_free();
}
```
However, I am not sure if this is the best idea.
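An alternative would be to guard on the loaded library handle itself rather than a status flag; a sketch, where `Bindings` and the nullable `_lib` field are assumptions about how the generated bindings could be held, not the actual code:

```dart
// Sketch only: `Bindings` stands in for the generated llama_cpp bindings.
class Bindings {
  void llama_backend_free() {}
}

class Llama {
  // Null when the dynamic library could not be loaded.
  Bindings? _lib;

  void dispose() {
    // If the library never loaded there is nothing to free, and calling
    // llama_backend_free would fail with the dlsym lookup error above.
    final lib = _lib;
    if (lib == null) return;
    lib.llama_backend_free();
  }
}
```

That way the guard directly expresses the failure condition ("the library is not there") instead of relying on a status enum staying in sync with it.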