
TimeoutException when loading Gemma 3 4B with mmproj

Open GijsWithagen opened this issue 2 months ago • 5 comments

When trying to initialize the Gemma 3 4B model (gemma-3-4b-it-Q8_0.gguf) with the mmproj file (e.g., mmproj-BF16.gguf) on a MacBook Pro 16” M1 Max (32GB RAM), the initialization times out with:

TimeoutException (TimeoutException: Operation "model loading" timed out)

Without the mmproj, the model loads fine and can process normal text prompts, but it does not process images, which is expected.

To Reproduce:

import 'package:llama_cpp_dart/llama_cpp_dart.dart';

// Offload all layers to the GPU.
final modelParams = ModelParams()..nGpuLayers = -1;
final contextParams = ContextParams()
  ..nPredict = -1
  ..nCtx = 4096
  ..nBatch = 1024;
final samplerParams = SamplerParams()
  ..temp = 0.25
  ..topP = 0.90;

final loadCommand = LlamaLoad(
  path: "PATH_TO_MODEL",
  modelParams: modelParams,
  contextParams: contextParams,
  samplingParams: samplerParams,
  mmprojPath: "PATH_TO_MMPROJ_MODEL",
);

final parent = LlamaParent(loadCommand);
await parent.init(); // <- TimeoutException occurs here

The code looks similar to the given examples, so I don't know what I'm doing wrong.
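For reference, the text-only setup that does work looks roughly like this. This is only a sketch; the stream/sendPrompt calls are assumed from the package's examples, and the variable names are mine:

final textOnlyLoad = LlamaLoad(
  path: "PATH_TO_MODEL", // same model, but no mmprojPath
  modelParams: modelParams,
  contextParams: contextParams,
  samplingParams: samplerParams,
);

final textParent = LlamaParent(textOnlyLoad);
await textParent.init(); // completes without timing out

// Assumed API from the package examples: token stream + sendPrompt.
textParent.stream.listen((token) => print(token));
textParent.sendPrompt("Hello, how are you?");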

I got the models here

Expected behavior: The model should load successfully with the mmproj file and be able to process image prompts.

GijsWithagen · Nov 12 '25 23:11

Update to the newest version.

netdur · Nov 13 '25 00:11

The timeout also occurs when I use the latest version of the package (^0.1.2) and the latest llama.cpp build (llama-b7046-xcframework.zip).

I run the project as a Flutter project for macOS.

GijsWithagen · Nov 13 '25 12:11

I just tested my code inside a Dart console application, as done in the examples, but with the downloaded Gemma 3 model, and there it works. So it seems to be a problem with running the package from Flutter.

GijsWithagen · Nov 14 '25 00:11

@GijsWithagen then there is a high chance that llama.cpp could not access the model. Make sure the model file is accessible, for example by moving it from the assets folder to device storage and then providing the full path to the model.
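Roughly like this sketch (the asset name and destination are placeholders, not part of the package):

import 'dart:io';
import 'package:flutter/services.dart' show rootBundle;
import 'package:path_provider/path_provider.dart';

// Copy a bundled asset into app storage so llama.cpp can open it by full path.
// "assets/gemma-3-4b-it-Q8_0.gguf" would be a placeholder asset key.
Future<String> copyAssetToStorage(String assetKey) async {
  final data = await rootBundle.load(assetKey);
  final dir = await getApplicationDocumentsDirectory();
  final file = File('${dir.path}/${assetKey.split('/').last}');
  await file.writeAsBytes(
    data.buffer.asUint8List(data.offsetInBytes, data.lengthInBytes),
    flush: true,
  );
  return file.path; // pass this absolute path to LlamaLoad
}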

netdur · Nov 14 '25 14:11

@netdur thanks! But unfortunately that's not the problem, as the model does work on macOS when I don't provide the mmprojPath.
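A quick dart:io check (just a sketch with placeholder paths) also shows both files exist and are readable:

import 'dart:io';

void main() {
  // Placeholder paths; replace with the actual model / mmproj locations.
  for (final p in ["PATH_TO_MODEL", "PATH_TO_MMPROJ_MODEL"]) {
    final f = File(p);
    print('$p exists=${f.existsSync()} '
        'size=${f.existsSync() ? f.lengthSync() : 0} bytes');
  }
}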

GijsWithagen · Nov 14 '25 17:11