Konstantinos Angeloulis
Hello, I ran inference on the Faster R-CNN model from the ONNX Model Zoo, and an error occurs during `calculate_macs`: ``` raise ValueError( ValueError: This ORT build has ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider'] enabled....
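This ValueError typically means the session was created without an explicit `providers` argument, which ONNX Runtime requires when several execution providers are built in. A minimal sketch of the workaround, assuming an `onnxruntime` install; the helper name `choose_providers` and the model path are illustrative, not from the original post:

```python
def choose_providers(available,
                     preferred=("CUDAExecutionProvider", "CPUExecutionProvider")):
    """Filter a preference-ordered provider list against what this ORT build offers."""
    chosen = [p for p in preferred if p in available]
    if not chosen:
        raise ValueError(f"None of {preferred} are available in {available}")
    return chosen

# With onnxruntime installed, you would pass the result explicitly (sketch):
# import onnxruntime as ort
# sess = ort.InferenceSession(
#     "model.onnx",  # hypothetical path
#     providers=choose_providers(ort.get_available_providers()),
# )
```

Passing `providers` explicitly also lets you pin the session to CPU only when debugging GPU-specific failures.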
I evaluated all of the classification models on ImageNet according to their preprocessing descriptions: Models: ----------- - squeezenet1.0-12.onnx - bvlcalexnet-12.onnx - caffenet-12.onnx - rcnn-ilsvrc13-9.onnx - inception-v1-12.onnx - inception-v2-9.onnx - zfnet512-12.onnx...
# Bug Report ### Which model does this pertain to? Faster R-CNN, Opset 12 ### Describe the bug I am profiling Faster R-CNN and calculating its throughput...
Hello everyone, I took some measurements today while running inference on a neural-network model on a two-socket AMD server, and during the measurement...
I ran inference on the model with both CPU and GPU, but the only difference between the two seems to be just 4 ms: the CPU reaches almost 26 ms and CUDA 21...
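Small CPU/GPU gaps like this are often a measurement artifact (no warm-up, single run, host-device transfer dominating a small model). A minimal timing harness sketch, stdlib only; `benchmark` and the commented `sess_cpu`/`sess_gpu` session names are assumptions for illustration:

```python
import statistics
import time

def benchmark(run_once, warmup=5, iters=50):
    """Time a callable: discard warm-up runs, then return mean latency in ms."""
    for _ in range(warmup):
        run_once()  # warm-up: lets lazy allocations and kernel caches settle
    samples = []
    for _ in range(iters):
        start = time.perf_counter()
        run_once()
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(samples)

# Placeholder workloads standing in for the per-provider session runs (sketch):
# cpu_ms = benchmark(lambda: sess_cpu.run(None, inputs))
# gpu_ms = benchmark(lambda: sess_gpu.run(None, inputs))
```

Reporting the mean over many iterations (and ideally the spread) makes a 4 ms gap interpretable; a single timed run can easily vary by more than that.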
## Bug Hello, I am trying to preview just a simple `welcome to Avalonia` project, but it seems the preview button doesn't work. ## To Reproduce Steps to reproduce the...