Cut MNN build size according to the operators in use
Context: I am trying to run models on Android devices and have a constraint on how much the app size can increase. I have tried all the build flags that help decrease the size of the MNN libraries, such as building for CPU only. Even after these efforts, I need to reduce the library size further for it to be acceptable for my use case.
Problem: Is there a way to create MNN libraries by cutting the build down to a specific set of operators? For example, if I only want to run a particular set of tree-based models, I would pick the operators those models use and build a reduced MNN library that contains only those operators. Ideally I would want something like what ONNX Runtime's custom build currently offers, as sketched below.
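For reference, this is roughly what the ONNX Runtime reduced-operator build looks like; the exact build.py arguments and the ops-config format may differ across ONNX Runtime versions, and the operator list below is a made-up example, so treat this as an illustrative sketch rather than exact commands:

```bash
# Sketch of an ONNX Runtime "custom build" limited to chosen operators.
# required_ops.config lists the operators to keep, one domain per line, e.g.:
#   ai.onnx;12;Add,Gather
#   ai.onnx.ml;2;TreeEnsembleClassifier
python tools/ci_build/build.py \
  --config MinSizeRel \
  --minimal_build \
  --include_ops_by_config required_ops.config \
  --skip_tests
```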
You can build MNN with the options -DMNN_BUILD_MINI=true -DMNN_SEP_BUILD=true to reduce the MNN library size. At the same time, you should convert your MNN model with the option --saveStaticModel.
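A minimal sketch of how these options fit together, assuming an Android cross-compile with the NDK; the ABI, API level, and file names (model.onnx, model.mnn, the bizCode value) are placeholders to adapt:

```bash
# Build a size-reduced MNN for Android (arm64). MNN_BUILD_MINI strips
# dynamic-shape support, so the model must be converted as a static model.
cd MNN
./schema/generate.sh
mkdir build_android && cd build_android
cmake .. \
  -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK/build/cmake/android.toolchain.cmake \
  -DANDROID_ABI=arm64-v8a \
  -DMNN_BUILD_MINI=true \
  -DMNN_SEP_BUILD=true \
  -DMNN_BUILD_SHARED_LIBS=true
make -j$(nproc)

# On the host, build the converter (cmake with -DMNN_BUILD_CONVERTER=ON),
# then convert with fixed shapes baked in so the mini runtime can execute it.
./MNNConvert -f ONNX --modelFile model.onnx --MNNModel model.mnn \
  --bizCode demo --saveStaticModel
```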
We want something like ONNX Runtime's custom build in MNN. Is there any way to use such a feature in MNN?