Ramiro Leal-Cavazos
> it started its life as `view(1, 1, key_size, key_size)`

Do you have a link to the PyTorch model definition where they do this `view`?

> I suppose the problem...
> I am actually coming via fx_importer, but I don't think the view lowering takes the strict_symbolic_shapes into account?

Oh, then maybe we can fix this by modifying the importer....
@dan-garvey, are you using a similar import process to the one in the reproducer? I think you mentioned you were using the `fx` importer rather than the `torchscript` one.

> I'm curious what...
Also, the importer I'm talking about is https://github.com/llvm/torch-mlir/blob/main/python/torch_mlir/extras/fx_importer.py, and here's an example of how to use it: https://github.com/llvm/torch-mlir/blob/main/test/python/fx_importer/basic_test.py
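For reference, here's a rough sketch of the import flow based on the linked `basic_test.py`. The `fx.export_and_import` helper and its exact signature may differ between torch-mlir revisions, so treat this as an outline rather than the exact test code:

```python
import torch
from torch_mlir import fx


class Basic(torch.nn.Module):
    def forward(self, x):
        return torch.tanh(x)


# torch.export traces the module into an ExportedProgram, and the fx
# importer then converts that program into MLIR in the `torch` dialect.
module = fx.export_and_import(Basic(), torch.randn(3, 4))
print(module)
```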
> given what you propose I wonder if it wouldn't be better to represent all the dynamic dims as the output of a size op?

I wonder how hard it...
AFAIK, there currently isn't a way to do this in torch-mlir.
Yes, I think we can consider this stale, since there hasn't been an update in a year and a half.
> Another way to implement this is to enable/allow users to provide these shape and dtype functions in the same module that they provide their `func.func @forward(...`.

I show a...
@igozali, there is some information about why that character is used here: https://github.com/llvm/torch-mlir/blob/449cfb83755b97115ba7c3c27b0efdb03415810b/docs/adding_abstract_interpretation_functions.md?plain=1#L26-L27 and the rationale for using the entire operator name for each function [here](https://github.com/llvm/torch-mlir/blob/449cfb83755b97115ba7c3c27b0efdb03415810b/docs/abstract_interp_lib.md#use-of-full-operator-signatures). In short, we wanted to...
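To give a concrete picture of the convention, here's a simplified sketch of what the shape and dtype functions for an op like `aten.tanh` look like in the abstract interpretation library. The bodies below are illustrative only (the real dtype function, for example, also handles integer-to-float promotion); the names show the convention where `〇` stands in for `.` in the operator name and `〡` separates the operator name from the function kind:

```python
from typing import List, Tuple


# Shape function: an elementwise op like tanh preserves the input shape.
def aten〇tanh〡shape(self: List[int]) -> List[int]:
    return self


# Dtype function: simplified here to just pass the input dtype through.
def aten〇tanh〡dtype(self_rank_dtype: Tuple[int, int]) -> int:
    self_rank, self_dtype = self_rank_dtype
    return self_dtype
```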
@eellison, would it be possible to get some help with the CI failure? It seems unrelated to the changes in this patch.