Edward Chen
> > why not add the additional libraries at the time they are used?
>
> admittedly thought the above was going to be more work than it was so...
/azp run Linux QNN CI Pipeline,Windows ARM64 QNN CI Pipeline
I believe something similar was implemented in https://github.com/microsoft/onnxruntime/pull/26396. If additional functionality is needed, can the infrastructure from that PR be extended?
I see, I missed the update to onnxruntime/test/util/test_utils.cc. Are you aware of the [debug node I/O dumping infrastructure](https://onnxruntime.ai/docs/build/inferencing.html#debugnodeinputsoutputs)? I'm wondering whether that would work, or whether it is worth having...
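
For context, a minimal sketch of how that dumping infrastructure can be exercised from a test script, assuming a build configured with the `onnxruntime_DEBUG_NODE_INPUTS_OUTPUTS` CMake define and the `ORT_DEBUG_NODE_IO_*` environment variables described in the linked docs (the model path and input dtype below are placeholders):

```python
# Sketch only: assumes onnxruntime was built with the
# onnxruntime_DEBUG_NODE_INPUTS_OUTPUTS define enabled (see linked build docs).
import os

# The debug dumping code reads these environment variables at run time,
# so set them before creating the InferenceSession.
os.environ["ORT_DEBUG_NODE_IO_DUMP_INPUT_DATA"] = "1"
os.environ["ORT_DEBUG_NODE_IO_DUMP_OUTPUT_DATA"] = "1"

import numpy as np
import onnxruntime as ort

# "model.onnx" is a placeholder for whatever model is under test.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
inp = sess.get_inputs()[0]
# Substitute 1 for any dynamic dimensions; assumes a float32 input.
shape = [d if isinstance(d, int) else 1 for d in inp.shape]

# Each node's inputs/outputs are dumped as the run executes.
outputs = sess.run(None, {inp.name: np.zeros(shape, dtype=np.float32)})
```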
> Hi @edgchen1 ,
>
> I was not aware of the existing I/O dumping infrastructure earlier, but I have now tried this functionality and confirmed that it successfully dumps...
Filed an ONNX bug: https://github.com/onnx/onnx/issues/7514
Closing and reopening to restart CI pipelines.
/azp run Windows ARM64 QNN CI Pipeline
/azp run Linux QNN CI Pipeline,Win_TRT_Minimal_CUDA_Test_CI,Windows GPU Doc Gen CI Pipeline
/azp run Linux QNN CI Pipeline,Windows ARM64 QNN CI Pipeline