Koan-Sin Tan
For the ai-edge-torch example, build with

```
$ bazel build -c opt \
    --config android_arm64 --cxxopt=-std=c++17 \
    //ai_edge_torch/generative/examples/cpp:text_generator_main
```

on a Mac mini, and then put the binary and models to /data/local/tmp/...
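A sketch of the device-side steps, for anyone reproducing this. The model filename and the flag names here are assumptions (check the example's `--help` output and the repo README for the real options); only the binary name comes from the build target above.

```shell
# Push the freshly built binary and a converted model to the device.
# The model/tokenizer filenames below are hypothetical placeholders.
adb push bazel-bin/ai_edge_torch/generative/examples/cpp/text_generator_main /data/local/tmp/
adb push model.tflite /data/local/tmp/
adb push tokenizer.model /data/local/tmp/

# Run on the device; verify flag names against the binary's --help.
adb shell chmod +x /data/local/tmp/text_generator_main
adb shell /data/local/tmp/text_generator_main \
    --tflite_model=/data/local/tmp/model.tflite \
    --sentencepiece_model=/data/local/tmp/tokenizer.model \
    --prompt="Write an email"
```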
> [@freedomtan](https://github.com/freedomtan) How did you manage to get the weight cache? I believe my device was running out of memory while attempting to generate it.

Nothing special. I tested it...
@anhappdev please help @farook-edev run the test either on Firebase or BrowserStack.
> Thanks [@freedomtan](https://github.com/freedomtan) [@anhappdev](https://github.com/anhappdev), I managed to get the example to compile for the x86-64 emulator. I needed to add the following config to bazelrc:
>
> ```
> build:android_x86_64...
> The next step I assume is to build a pipeline for LLM based on this example. Could you please confirm?

[@freedomtan](https://github.com/freedomtan), alternatively, I could help in testing the different...
4.1 seeds are in https://github.com/mlcommons/inference/pull/1736
@anhappdev please add [email protected] as an internal tester.
Let's see if we can

1. aab (in the CI) -> apk
2. try to get technical support from Play Store?
3. @freedomtan to try on MTK devices (or Samsung...
Works on

- MTK devices: both the newer .dla and the older TFLite Neuron Delegate work
- Pixel 8 Pro: works

@AhmedTElthakeb and @Mostelk: please check if the Play Store apk works...
@AhmedTElthakeb will try to run the Play Store one w/ SELinux disabled (if that works, we are almost certain there is some kind of permission-setting problem).
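For reference, a minimal sketch of how to check and temporarily disable SELinux enforcement over adb; `setenforce` needs a rooted device, and the change does not survive a reboot.

```shell
# Check the current SELinux mode: prints "Enforcing" or "Permissive"
adb shell getenforce

# Switch to permissive mode (requires root; reverts on reboot)
adb root
adb shell setenforce 0
adb shell getenforce
```

If the app works in permissive mode but fails in enforcing mode, `adb logcat | grep avc` should show the denied operations, which narrows down which permission is missing.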