Tung D. Le
Addressed by #1685
@jenkins-droid test this please
@kernhanda thank you for the patch! Could you please look at why the lit tests failed? You can check it locally by using `make check-onnx-lit`. Thanks!
@jenkins-droid test this please
I see a big advantage of specifying a custom entry point name: a user may have two (single-entry) ONNX models and want to run them in the same Python/C program....
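As a rough sketch of that use case (not from this thread, and assuming a PyRuntime `ExecutionSession` that accepts an entry-point name; the exact signature has changed across onnx-mlir versions, so verify against your build), loading two compiled models with distinct entry points in one Python process might look like:

```python
# Hypothetical sketch: two onnx-mlir-compiled libraries, each exposing its
# own entry-point name, used together in the same process.
# Assumes ExecutionSession(shared_lib_path, entry_point_name) -- check the
# PyRuntime API of your onnx-mlir version before relying on this.
import numpy as np
from PyRuntime import ExecutionSession

# Suppose model_a.so and model_b.so were compiled with distinct entry
# points ("run_encoder" / "run_decoder" are made-up names for illustration).
encoder = ExecutionSession("model_a.so", "run_encoder")
decoder = ExecutionSession("model_b.so", "run_decoder")

x = np.random.rand(1, 3, 224, 224).astype(np.float32)
hidden = encoder.run([x])     # outputs of the first model...
result = decoder.run(hidden)  # ...fed as inputs to the second one
```

With a single hard-coded entry point name such as `main_graph`, the two shared libraries would clash; custom entry point names avoid that.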
> Do you think it's necessary to make the change so that if an MLIR file has a single entry point named something other than main_graph, it should still...
@kernhanda any update on this?
As I understand it, the proposal here is to integrate `ONNX->Torch` into onnx-mlir only and not to utilize onnx-mlir further, e.g. down to the LLVM level. Also, this `erase-onnx-entry-point` indicates the onnx-mlir driver is...
> The idea is that ONNX->Torch could save effort for the onnx-mlir community, by providing ONNX->Torch->{Linalg,MHLO,TOSA} which gives Linalg, MHLO, and TOSA "for free" (and eventually TCP, StableHLO, etc.) Just...
@Connor-XY @yaochengji Could you please chime in? I'd like to hear your comments about `ONNX->Torch->MHLO` vs `ONNX->MHLO`. Thanks!