Hyeongseok Oh
> I thought there was an additional parameter to specify a model from multiple models, but after reading our API, I found that we don't have an API that takes an index to...
> Then, which model will be executed after loading multiple models if they have no conn info? I didn't think that users need to select inference at the model level. I thought...
One remaining item is `Support connection information packaging`. I'll close this issue; the remaining item will be handled in a different issue.
> how about going to 2.4?

Good. One concern is that Tizen is using TF 2.3.
@jyoungyun If there is no response, I'll merge this PR.
This is not a scheduled issue yet.
Example nnpackage: [three_tflites.tar.gz](https://github.com/Samsung/ONE/files/9762251/three_tflites.tar.gz)

metadata
```
{
  "major-version" : "1",
  "minor-version" : "3",
  "patch-version" : "0",
  "configs" : [ ],
  "models" : [ "mv1.0.tflite", "mv1.1.tflite", "mv1.2.tflite" ],
  "model-types" : [ "tflite", ...
```
More test cases: [three_tflites2.tar.gz](https://github.com/Samsung/ONE/files/9796564/three_tflites2.tar.gz)

metadata
```
{
  "major-version" : "1",
  "minor-version" : "3",
  "patch-version" : "0",
  "configs" : [ ],
  "models" : [ "inception_v3.14to22.tflite", "inception_v3.23to32.tflite", "inception_v3.33to38.tflite" ],
  "model-types" : [ "tflite", ...
```
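The metadata in the two test packages above pairs each entry in `models` with an entry in `model-types`. A minimal sketch of how a loader might read such a manifest, assuming only the key names shown in the examples (the validation logic is my own, not ONE's actual implementation):

```python
import json

def parse_metadata(text):
    """Parse an nnpackage metadata JSON string and return (model, type) pairs.

    Key names follow the example manifests above; everything else is assumed.
    """
    meta = json.loads(text)
    models = meta["models"]
    types = meta.get("model-types", [])
    # Each model entry should have a matching type entry.
    if len(types) != len(models):
        raise ValueError("models and model-types length mismatch")
    return list(zip(models, types))

# Example manifest mirroring three_tflites.tar.gz (types filled in by assumption).
example = """
{
  "major-version": "1",
  "minor-version": "3",
  "patch-version": "0",
  "configs": [],
  "models": ["mv1.0.tflite", "mv1.1.tflite", "mv1.2.tflite"],
  "model-types": ["tflite", "tflite", "tflite"]
}
"""
print(parse_metadata(example))
# [('mv1.0.tflite', 'tflite'), ('mv1.1.tflite', 'tflite'), ('mv1.2.tflite', 'tflite')]
```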
> many duplicated logic and code

Then we can introduce a new option in `nnpackage_run` (e.g. `--modelfile`) to load a supported model file type (and rename it if needed).

> I think...
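To illustrate the proposed option, here is a hypothetical sketch of how `--modelfile` could dispatch by file extension. Only the option name comes from the comment above; the supported extensions and dispatch logic are assumptions for illustration, not ONE's actual behavior:

```python
import argparse
import os

# Assumed extension-to-type mapping; real support depends on the runtime build.
SUPPORTED = {".tflite": "tflite", ".circle": "circle"}

def parse_args(argv):
    parser = argparse.ArgumentParser(prog="nnpackage_run")
    parser.add_argument("--modelfile",
                        help="single model file to load directly, "
                             "instead of an nnpackage directory")
    return parser.parse_args(argv)

def model_type(path):
    """Return the model type for a file path, based on its extension."""
    ext = os.path.splitext(path)[1]
    if ext not in SUPPORTED:
        raise ValueError(f"unsupported model file type: {ext}")
    return SUPPORTED[ext]

args = parse_args(["--modelfile", "mv1.0.tflite"])
print(model_type(args.modelfile))  # tflite
```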