huangtianhua

Results 8 comments of huangtianhua

@jcwchen, Thanks for your reply. I agree that we don't strictly require that the models be runnable in every execution provider, but if I were a user I would wonder whether the...

Hi @jcwchen, > IIRC, in ORT's roadmap, they are planning to add a nightly or weekly CI in their repo to verify the latest main branch in ORT with all...

@jcwchen, OK, got it, thanks. I will ask for this in the onnxruntime community.

@jcwchen, Hi, sorry to disturb you. I proposed the issue and discussion in the onnxruntime community a long time ago, but it seems no one is interested in them, so maybe you...

@jcwchen Thanks for your effort on this. It's good that the models in the ONNX Model Zoo work with the CPU provider, but users want to know which execution providers other than...

@jcwchen OK, what you said makes sense to me. > Another update: I talked to ORT core team and they are working on having a regular CI testing ONNX Model...

How can I do this? I am currently working on adapting OHPC to the openEuler system, https://build.openeuler.org/project/show/home:huangtianhua:ohpc — you can see that the package 'intel-compilers-devel' is unresolvable, and if I delete the line "Requires(pre): intel-oneapi-compiler-dpcpp-cpp-and-cpp-classic...