Bowen Bao
@ezyang the Python side idea is appealing for its simplicity. We are considering something like this for the attribute side, especially if they are scalars or Python numbers. It is...
@ezyang the main goal is to be able to preserve the inputs/outputs order when grouping nodes belonging to a certain nn.Module as a custom onnx function. The current onnx pass is...
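For context, a minimal sketch of what "grouping an nn.Module as a custom ONNX function" looks like from the user side, via the TorchScript exporter's `export_modules_as_functions` argument (the `GeLUBlock` module and output file name here are made up for illustration; this is not the pass under discussion, just the user-facing behavior it backs):

```python
import torch
import torch.nn as nn


class GeLUBlock(nn.Module):
    def forward(self, x):
        return torch.nn.functional.gelu(x)


class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)
        self.act = GeLUBlock()

    def forward(self, x):
        return self.act(self.linear(x))


model = Model()
dummy = torch.randn(2, 4)

# Group every GeLUBlock instance into an ONNX local function. The exporter
# must keep the function's formal inputs/outputs aligned with the module's
# forward signature, which is the ordering concern mentioned above.
torch.onnx.export(
    model,
    (dummy,),
    "model.onnx",
    opset_version=15,  # local functions require opset >= 15
    export_modules_as_functions={GeLUBlock},
)
```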
> Hmm... what if we add some `__torch_function__` style functionality so that you can interpose on the nn.Module call at the time you are tracing, and immediately do the `custom_domain`...
@ezyang That's a valid point. I guess we can explore not changing the TorchScript IR and try to manage things in Python. Instead of the `push_scope`/`pop_scope` proposal, we could possibly depend...
@albanD @ezyang friendly ping for response. Is [this](https://github.com/pytorch/pytorch/blob/5950240bdf4fe0734ea47180dfbd9b69c7091fa9/torch/nn/modules/module.py#L52) the api for

> global module hook (that fire for every Module that run)

In the link, the comment states

> The...
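If that link is indeed the global hook registry, here is a minimal sketch of how a Python-side scope stack could be kept with the global hooks that fire for every module, instead of touching the TorchScript IR (the `_enter_scope`/`_exit_scope` names and the print statements are hypothetical, just to show the mechanics):

```python
import torch
import torch.nn as nn

# Hypothetical Python-side scope stack, maintained purely with global hooks.
_scope_stack = []


def _enter_scope(module, inputs):
    _scope_stack.append(type(module).__name__)
    print("enter:", "/".join(_scope_stack))


def _exit_scope(module, inputs, output):
    print("exit: ", "/".join(_scope_stack))
    _scope_stack.pop()


# These register hooks that fire for every nn.Module forward call.
pre_handle = nn.modules.module.register_module_forward_pre_hook(_enter_scope)
post_handle = nn.modules.module.register_module_forward_hook(_exit_scope)

model = nn.Sequential(nn.Linear(4, 4), nn.ReLU())
model(torch.randn(1, 4))

pre_handle.remove()
post_handle.remove()
```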
I'm updating the PR and resolving a few quick infra errors to get reasonable outputs from the unittests. I encountered this error and am not sure how to proceed. I recall encountering...
Found the root cause to be https://github.com/huggingface/optimum/blob/548d0ac5a2039c6ae73fda2ff53034365b1637d3/optimum/utils/import_utils.py#L61 returning False when the installed package is `onnx-weekly` instead of `onnx`. Looks like a similar issue was noted for onnxruntime and onnxruntime-gpu: https://github.com/huggingface/optimum/blob/548d0ac5a2039c6ae73fda2ff53034365b1637d3/optimum/utils/import_utils.py#L63-L64
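One possible fix, mirroring how optimum already probes both `onnxruntime` and `onnxruntime-gpu`, would be to check both distribution names when deciding whether onnx is available. A sketch under that assumption (not the actual optimum patch; the function name is illustrative):

```python
import importlib.metadata
import importlib.util


def is_onnx_available() -> bool:
    # `onnx-weekly` ships the same importable `onnx` package, so checking only
    # the `onnx` distribution name reports False for weekly installs. Probe
    # both distribution names, falling back across them.
    if importlib.util.find_spec("onnx") is None:
        return False
    for dist in ("onnx", "onnx-weekly"):
        try:
            importlib.metadata.version(dist)
            return True
        except importlib.metadata.PackageNotFoundError:
            continue
    return False
```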
Got an initial result after some tweaks to enable model patcher for dynamo.

```
=== 350 failed, 360 passed, 11 skipped, 18766 warnings in 8416.29s (2:20:16) ===
```

We are...