edgeai-tidl-tools
What effect does 'task_type' parameter have on the model?
Hi, I am trying to convert a number of ONNX models to the tvm_dlr format, using 'examples/osrt_python/tvm_dlr/tvm_compilation_onnx_example.py' as a reference, and I am confused about the 'model_type' parameter that is passed during YAML creation.
1- Is the 'model_type' parameter used for running post-processing on the accelerator?
2- We have our own custom post-processing for our ONNX models. Is there a way to convert the models and run inference on them without providing this parameter? (For example, we have a YOLOv4-tiny-based ONNX model with multiple outputs; we just want to convert it and run inference with tvm_dlr, with no post-processing.)
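For context on question 2, our custom post-processing works directly on the raw output tensors. It is roughly along these lines (a simplified numpy sketch, not our exact code; the head layout `[cx, cy, w, h, objectness, class scores...]` and the threshold value are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_head(raw, conf_thresh=0.25):
    """Decode one raw detection head of shape (N, 5 + num_classes) into
    (boxes, scores, class_ids), keeping rows above a confidence threshold.
    Assumed row layout: [cx, cy, w, h, objectness, class scores...]."""
    obj = sigmoid(raw[:, 4])                  # objectness per candidate box
    cls = sigmoid(raw[:, 5:])                 # per-class probabilities
    scores = obj[:, None] * cls               # combined confidence per class
    class_ids = scores.argmax(axis=1)         # best class for each box
    best = scores[np.arange(len(scores)), class_ids]
    keep = best >= conf_thresh                # drop low-confidence boxes
    return raw[keep, :4], best[keep], class_ids[keep]

# Tiny demo with fake logits: one confident box, one background row.
raw = np.array([
    [0.5, 0.5, 0.2, 0.2,  6.0,  6.0, -6.0],   # high objectness, class 0
    [0.1, 0.1, 0.3, 0.3, -6.0, -6.0, -6.0],   # low objectness -> dropped
], dtype=np.float32)
boxes, scores, ids = decode_head(raw)
print(len(boxes), ids.tolist())               # prints: 1 [0]
```

Since we already do this ourselves (plus NMS across the heads afterwards), we only need the runtime to hand back the raw multi-output tensors unchanged.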
3- Is there a generic, simple C++ example for inference only? 'examples/osrt_cpp/dlr/dlr_main.cpp' is quite simple, but we get errors when we try to load our YOLOv4-tiny-based model; perhaps the model type parameter is causing the problems.