tract
Tiny, no-nonsense, self-contained TensorFlow and ONNX inference
Hello. I am trying to use this C API. I've built the C API DLL with Rust 1.68.2 and DeepFilterNet 0.4.0. #include #include "deep_filter.h" #pragma comment(lib,"WS2_32") int main() { float atten_lim =...
Hello, I'm wondering if there is any way to retrieve the [metadata](https://onnxruntime.ai/docs/api/python/auto_examples/plot_metadata.html) associated with an ONNX model from the tract API?
https://github.com/Rikorose/DeepFilterNet/pull/211#issuecomment-1353637586 Digging a bit into why I was seeing so many f32/f16 conversions despite the A55 supporting fp16 storage and arithmetic, it seems like this is just a limitation...
Hi, really excited about this project. I have been looking into using tract for object-detection inference on CPU and can get around 73-75 ms inference speed with onnxruntime, but only around...
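For figures like the ~73-75 ms above to be comparable across runtimes, they need to be measured the same way. A minimal sketch of a fair per-call latency measurement; `run_inference` is a hypothetical placeholder for whatever model call is being benchmarked:

```python
# Sketch: measuring mean per-call latency with warm-up iterations.
# run_inference() is a placeholder standing in for the real model call
# (e.g. session.run(...) in onnxruntime or model.run(...) in tract).
import time

def run_inference():
    # stand-in workload for the actual inference call
    sum(i * i for i in range(10_000))

def mean_latency_ms(iters=20, warmup=3):
    for _ in range(warmup):
        run_inference()          # exclude one-time setup/cache effects
    start = time.perf_counter()
    for _ in range(iters):
        run_inference()
    return (time.perf_counter() - start) * 1e3 / iters

print(f"mean latency: {mean_latency_ms():.2f} ms")
```

The warm-up loop matters: the first few calls often include allocation and cache effects that would inflate a single-shot timing and make cross-runtime comparisons misleading.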
It would be great to take advantage of the cost_model setup for arbitrary ARM CPUs (like the A57) and generate better cost models for them, but I'm not really sure...