Fariz
This command exports a Transformers model to an ONNX model: `!optimum-cli export onnx --model /content/bert-base-indonesian-1.5G-sentiment-analysis-smsa bert-sentiment-onnx-fp16-opset/ --opset 13 --task text-classification --optimize 'O1' --device 'cuda' --fp16`
Oh really? But I was able to run YOLOv8 inference in the DeepSparse pipeline using an ONNX model exported with the fp16 option to reduce the model size. Is that not...
I have already installed Cargo, but I still get an error:
```
(base) fariz@fariz:~/Desktop/spotify-adblock$ sudo make install
# cargo build --profile release
cargo build --release
make: cargo: Command not found
make: *** [Makefile:14:...
```