OnnxStream
[question] Why convert the models from ONNX to TXT format?
Hello
The models to be used by OnnxStream first have to be converted to ONNX format and then to TXT, as per the description. But if you are able to get the models in ONNX format, why not use the onnxruntime library as a third-party dependency to do the work? Is there a special reason?
hi,
the main reason is that OnnxStream allows you to "stream" the parameters of a model, so it can run very large models on devices with little RAM (sacrificing inference speed as a result).
The alternative is not being able to run these models on these devices at all :-)
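To give an idea of what "streaming" means here, this is a rough conceptual sketch (not OnnxStream's actual API, and the file layout is made up): each layer's weights are read from disk right before they are needed and released right after, so only one layer's parameters are resident in RAM at any time.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical loader: reads one tensor's raw float32 data from its own file.
std::vector<float> load_weights(const std::string& path)
{
    std::vector<float> data;
    FILE* f = std::fopen(path.c_str(), "rb");
    if (!f) return data;
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    data.resize(static_cast<size_t>(size) / sizeof(float));
    std::fread(data.data(), 1, static_cast<size_t>(size), f);
    std::fclose(f);
    return data;
}

int main()
{
    // Only one layer's weights live in RAM at a time.
    for (int layer = 0; layer < 32; ++layer)
    {
        std::vector<float> w =
            load_weights("weights/layer_" + std::to_string(layer) + ".bin");

        // ... run this layer's computation using w ...

    } // w goes out of scope here, freeing its memory before the next layer loads
    return 0;
}
```

With a traditional runtime that loads the full model up front, peak RAM is roughly the size of all the weights; with streaming it is roughly the size of the largest single layer, at the cost of re-reading from disk on every pass.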
Vito