How to measure LLM model performance?
Hi
I would like to measure the performance of the Gemma model on-device (Android) with MediaPipe.
I read the blog post about running LLMs on-device with MediaPipe (https://developers.googleblog.com/en/large-language-models-on-device-with-mediapipe-and-tensorflow-lite/).
How can I get LLM model performance metrics (e.g., TTFT, TPOT)?
I installed the LLM Inference example, but I cannot find any logs about performance.
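Is manual instrumentation the expected approach? Below is a rough sketch of what I have in mind: timestamping around `generateResponseAsync()` using the option/listener names from the LLM Inference Android sample. I'm assuming each partial-result callback corresponds to roughly one decoded token, which may not be accurate, so the TPOT number is only an approximation.

```kotlin
import android.content.Context
import android.os.SystemClock
import android.util.Log
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Rough manual instrumentation of TTFT / TPOT around generateResponseAsync().
// Assumption: each partial-result callback is roughly one decoded token,
// so the TPOT value below is only an approximation.
class LlmPerfProbe(context: Context, modelPath: String) {

    private var startNanos = 0L
    private var firstTokenNanos = 0L
    private var callbackCount = 0

    private val llm: LlmInference

    init {
        val options = LlmInference.LlmInferenceOptions.builder()
            .setModelPath(modelPath)
            .setMaxTokens(512)
            .setResultListener { _, done ->
                val now = SystemClock.elapsedRealtimeNanos()
                if (callbackCount == 0) {
                    // First callback after the request: time to first token.
                    firstTokenNanos = now
                    val ttftMs = (firstTokenNanos - startNanos) / 1e6
                    Log.d("LlmPerf", "TTFT: %.1f ms".format(ttftMs))
                }
                callbackCount++
                if (done) {
                    // Average time per output "token" after the first one.
                    val decodeMs = (now - firstTokenNanos) / 1e6
                    val tpotMs =
                        if (callbackCount > 1) decodeMs / (callbackCount - 1) else 0.0
                    val tokensPerSec = if (tpotMs > 0) 1000.0 / tpotMs else 0.0
                    Log.d(
                        "LlmPerf",
                        "callbacks=%d, TPOT~%.1f ms (~%.1f tokens/s)"
                            .format(callbackCount, tpotMs, tokensPerSec)
                    )
                }
            }
            .build()
        llm = LlmInference.createFromOptions(context, options)
    }

    fun run(prompt: String) {
        startNanos = SystemClock.elapsedRealtimeNanos()
        callbackCount = 0
        llm.generateResponseAsync(prompt)
    }
}
```

If there is a built-in way to get these numbers (logs, tracing, or an API), I'd prefer that over this kind of hand-rolled timing.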