CPU model not supported?

Running llm_inference locally throws the error below for gemma-2b-it-cpu-int8.bin. Is only the GPU backend type supported?
calculator_graph.cc:892] INVALID_ARGUMENT: CalculatorGraph::Run() failed:
Calculator::Open() for node "LlmGpuCalculator" failed: Please use a tensorflow lite model with gpu backend type, while the current model's backend type is: cpu
=== Source Location Trace: ===
third_party/odml/infra/genai/inference/calculators/llm_gpu_calculator.cc:279
third_party/mediapipe/framework/calculator_node.cc:560
_emscripten_errn @ genai_wasm_internal.js:9
$func8282 @ genai_wasm_internal.wasm:0x86abdc
$func4251 @ genai_wasm_internal.wasm:0x509788
$func2510 @ genai_wasm_internal.wasm:0x2780ff
$func200 @ genai_wasm_internal.wasm:0x95eb
$func2446 @ genai_wasm_internal.wasm:0x26c227
$qd @ genai_wasm_internal.wasm:0x596bad
ret.<computed> @ genai_wasm_internal.js:9
Module._waitUntilIdle @ genai_wasm_internal.js:9
finishProcessing @ tasks-genai:7
finishProcessing @ tasks-genai:7
Wr @ tasks-genai:7
(anonymous) @ tasks-genai:7
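For reference, the error suggests the Web (WASM) LLM Inference task only accepts GPU-backend model conversions, so a CPU-converted `.bin` would be rejected at graph startup. Below is a minimal sketch of initializing the task with the GPU model variant via the documented `@mediapipe/tasks-genai` API (`FilesetResolver.forGenAiTasks`, `LlmInference.createFromOptions`); the `gpuModelPath` and `createLlm` helpers and the CDN URL are illustrative assumptions, not part of the library.

```javascript
// Hypothetical helper: rewrite a CPU-converted model filename to its
// GPU-converted counterpart (e.g. gemma-2b-it-cpu-int8.bin -> -gpu-).
function gpuModelPath(path) {
  return path.replace("-cpu-", "-gpu-");
}

// Sketch of the documented setup for the Web LLM Inference task.
// Browser-only: the WASM runtime and WebGPU are not available in Node.
async function createLlm(modelAssetPath) {
  const { FilesetResolver, LlmInference } = await import(
    "@mediapipe/tasks-genai"
  );
  const genai = await FilesetResolver.forGenAiTasks(
    // Assumed CDN location of the WASM assets; adjust to your hosting.
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-genai/wasm"
  );
  return LlmInference.createFromOptions(genai, {
    baseOptions: { modelAssetPath },
  });
}

// Usage (sketch): pass the GPU variant of the model to avoid the
// "Please use a tensorflow lite model with gpu backend type" error.
// createLlm(gpuModelPath("gemma-2b-it-cpu-int8.bin"));
```

If the runtime genuinely has no GPU available, the web task has no CPU fallback for these `.bin` conversions, so the GPU-converted model is the one to host.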