Nikita Rubinkovskiy
I have the same issue using docker image 0.7.1-cpu. Any news?
I found a solution: metrics become available once you run inference against the models themselves (at least one of them), not against a workflow. E.g. `curl http://127.0.0.1:8080/predictions/dog_breed_wf__dog_breed_classification -T path_to_image/img.jpg` and `curl...
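For reference, a minimal sketch of the same check in Python. The host, port, model name, and image path are copied from the curl example above and are assumptions about a particular deployment; port 8082 is TorchServe's default metrics port.

```python
# Hit a model's prediction endpoint so that TorchServe starts reporting
# inference metrics, then read the metrics endpoint.
import requests

# Assumption: same deployment as in the curl example above.
with open("path_to_image/img.jpg", "rb") as f:
    resp = requests.post(
        "http://127.0.0.1:8080/predictions/dog_breed_wf__dog_breed_classification",
        data=f,
    )
print(resp.status_code, resp.text)

# After at least one such request, the metrics endpoint (port 8082 is the
# TorchServe default) should return non-empty inference metrics.
metrics = requests.get("http://127.0.0.1:8082/metrics")
print(metrics.text)
```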
Is there still no solution?
@AlekseySh Here is an example: https://github.com/Fissium/metric-learning/blob/main/examples/vit_to_onnx.py If that's what we need, I'll add it to the documentation.
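For context, a minimal sketch of what such an export might look like. It assumes OML's `ViTExtractor` accepts a pretrained-weights key and an architecture name as keyword arguments, and that the model takes 224x224 images; verify both against your installed version and the linked script.

```python
# Sketch: export an OML ViTExtractor to ONNX.
import torch
from oml.models import ViTExtractor

# Assumption: "vits16_dino" / "vits16" follow OML's documented weight keys.
extractor = ViTExtractor(weights="vits16_dino", arch="vits16", normalise_features=False).eval()

dummy = torch.randn(1, 3, 224, 224)  # NCHW image batch expected by the ViT
torch.onnx.export(
    extractor,
    dummy,
    "vits16.onnx",
    input_names=["images"],
    output_names=["embeddings"],
    opset_version=17,
    dynamic_axes={"images": {0: "batch"}, "embeddings": {0: "batch"}},
)
```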
@AlekseySh Yes, I'll check if this approach works for other architectures. As for the format, it depends on the number of such examples that aren't directly related to oml. But...
@AlekseySh - Experiment [link](https://colab.research.google.com/drive/1sjmHBmwAHY5Qq8pxD5QA2UdVTKmr1Aaz?usp=sharing)

**Results**

|Extractor|Arch|Export Support|Error|
|---|---|---|---|
|ViTExtractor|vits8|Yes|None|
|ViTExtractor|vits16|Yes|None|
|ViTExtractor|vitb8|Yes|None|
|ViTExtractor|vitb16|Yes|None|
|ViTExtractor|vits14|Yes|None|
|ViTExtractor|vitb14|Yes|None|
|ViTExtractor|vitl14|Yes|None|
|ViTExtractor|vits14_reg|No|Exporting the operator 'aten::_upsample_bicubic2d_aa' to ONNX opset version 17 is not supported|
|ViTExtractor|vitb14_reg|No|Exporting the operator...
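A sketch of how such a compatibility sweep could be scripted. The constructor arguments and the 224x224 dummy input are assumptions (see the caveats on the export sketch above); the architecture list mirrors the table.

```python
# Sketch: try ONNX export for several ViT architectures and record failures.
import torch
from oml.models import ViTExtractor

ARCHS = ["vits8", "vits16", "vitb8", "vitb16", "vits14", "vitb14", "vitl14",
         "vits14_reg", "vitb14_reg"]

for arch in ARCHS:
    try:
        # Assumption: weights=None gives a randomly initialised model,
        # which is enough to check operator support.
        model = ViTExtractor(weights=None, arch=arch, normalise_features=False).eval()
        dummy = torch.randn(1, 3, 224, 224)
        torch.onnx.export(model, dummy, f"{arch}.onnx", opset_version=17)
        print(f"{arch}: export OK")
    except Exception as e:  # e.g. unsupported ONNX operators
        print(f"{arch}: export failed -> {e}")
```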
@AlekseySh You're right. OML has nothing to do with ONNX. What if we create an additional section in the FAQ and just mention this issue there? So we can put...
Hi @VSXV. You can also take a look at my example: https://github.com/Fissium/metric-learning/blob/main/train_bert.py
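For readers who can't open the link, a minimal sketch of the idea behind it: wrapping a HuggingFace BERT so its pooled token embeddings can be used as features for metric learning. The model name and the mean-pooling choice are illustrative assumptions, not part of OML or of the linked script.

```python
# Sketch: BERT as a text embedding extractor.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

class BertEmbedder(nn.Module):
    def __init__(self, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        # Mean-pool token embeddings, ignoring padding tokens.
        mask = attention_mask.unsqueeze(-1).float()
        return (out.last_hidden_state * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(["a dog", "a cat"], padding=True, return_tensors="pt")
embeddings = BertEmbedder()(batch["input_ids"], batch["attention_mask"])
print(embeddings.shape)  # (2, 768) for bert-base
```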