Added Apple CoreML compilation tutorials for any SG model
CoreML Compilation Support For Any SG Model
I added a notebook for CoreML compilation, since it was requested by many of our users and it is low-hanging fruit.
Convert to CoreML:
- Select any SG model (yolo-nas-s is used by default) using models.get(...)
- Convert it to CoreML using the notebook, providing the nn.Module and a target file path for the CoreML model.
The notebook uses torch tracing, which Apple recommends as the go-to source for CoreML conversion (a minimal sketch follows below). Note: the Netron visualization does not work on Colab, since it assumes the notebook is running on localhost.
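For reference, here is a minimal sketch of the trace-and-convert flow, assuming coremltools is installed and yolo_nas_s as the model; the exact notebook code may differ:

```python
import torch
import coremltools as ct
from super_gradients.training import models

# Any SG model works; yolo_nas_s is used here as an example.
model = models.get("yolo_nas_s", pretrained_weights="coco")
model.eval()

# Trace with an example input - Apple recommends a traced TorchScript module
# as the source for CoreML conversion.
example_input = torch.rand(1, 3, 640, 640)
traced = torch.jit.trace(model, example_input)

# Convert the traced module and save it to the target path.
# The explicit input name follows the input_1/input_2 convention described below.
mlmodel = ct.convert(traced, inputs=[ct.TensorType(name="input_1", shape=example_input.shape)])
mlmodel.save("/content/yolo-nas.mlmodel")
```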
To compile your SG architecture or model to CoreML:
- Run the notebook on Colab
- Download the file from /content/yolo-nas.mlmodel
- Drag it into Xcode
- Use it in your iOS/macOS application!
The names of the inputs will always be input_1, input_2, and so on, to make things easy for researchers. CoreML classes use keyword arguments for inference, so having consistent input names helps a lot when working with different models and trying different things.
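As an illustration, this is roughly how the converted model can be queried by input name from Python with coremltools (predict only runs on macOS); the key "input_1" follows the convention above:

```python
import numpy as np
import coremltools as ct

# Load the converted model (same path as in the notebook above).
mlmodel = ct.models.MLModel("/content/yolo-nas.mlmodel")

# Inputs are passed by name, so the same call shape works for any converted SG model.
dummy = np.random.rand(1, 3, 640, 640).astype(np.float32)
outputs = mlmodel.predict({"input_1": dummy})
print(list(outputs.keys()))
```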
Pre-reviewed notebook on Colab (where I ran all cells): https://colab.research.google.com/drive/1vqa2_TJ3rgxbNaR5n--uVfpmkEkJjbrN?usp=sharing
Please note that it worked without calling prep_model_for_conversion. We can add it if needed and maybe run regression on all the models to see what breaks and what works.
Although it might still work without prep_model_for_conversion, we still need to call it, since this is where we fuse the RepVGG branches (otherwise we will get decreased inference performance).
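A minimal sketch of where that call could slot in before tracing, assuming the prep_model_for_conversion(input_size=...) signature (which may differ between SG versions):

```python
import torch
from super_gradients.training import models

model = models.get("yolo_nas_s", pretrained_weights="coco")
model.eval()

# Fuse the RepVGG-style branches before export, so the traced graph is the fused, faster one.
model.prep_model_for_conversion(input_size=(640, 640))

traced = torch.jit.trace(model, torch.rand(1, 3, 640, 640))
```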
- Added text cells
- Added prep_model_for_conversion
- Changed the input size to 640x640
Great work here! 💪
Maybe it makes sense to move the helper functions to the super_gradients.coreml namespace?
I added the code to super_gradients.models, just like the ONNX export. I also added tests for the CoreML mlmodel / mlpackage formats.
I didn't want to over-abstract here, since we don't expect many export functions as part of SG, so I re-used some of the parameters (but kept their names). Let me know if it makes sense. We can move the input preparation out into a third function.
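For context only, a hypothetical sketch of what such a helper could look like; convert_to_coreml is an assumed name mirroring the ONNX-style API and the trace-and-convert flow shown earlier, not the PR's actual code:

```python
import torch
import coremltools as ct
from super_gradients.training import models

def convert_to_coreml(model: torch.nn.Module, out_path: str, input_size=(640, 640)):
    """Hypothetical helper mirroring the ONNX-export-style parameter names (not the PR's real API)."""
    model.eval()
    example = torch.rand(1, 3, *input_size)
    traced = torch.jit.trace(model, example)
    mlmodel = ct.convert(traced, inputs=[ct.TensorType(name="input_1", shape=example.shape)])
    mlmodel.save(out_path)
    return mlmodel

# Example usage, keeping the same parameter names reused across exports.
convert_to_coreml(models.get("yolo_nas_s", pretrained_weights="coco"), "yolo_nas_s.mlmodel")
```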
Squashed and signed in https://github.com/Deci-AI/super-gradients/pull/1068/commits/62936fc7d32e5330bb2bb969099118c2fcfa6d65