Daniel Garvey
I haven't played around with fx transformations, but I completely agree that fundamentally we should take advantage of having access to the symbolic dims. Given what you propose, I wonder...
It's from this: https://github.com/llvm/torch-mlir/commit/e30a083affb65c301066eda3df7112c06f4291da#diff-d08876b67eafdb088647547d512c1b52173f7a6133f203c71a65c60c1918778b I'll try to rework the tests tomorrow.
So I should just rebase, yeah?

@powderluv I don't think it makes sense to add a bunch of variants of the same transformer base here, but let me know if you disagree. I was thinking we...
> LGTM, though maybe it would be good to get a number of models that rely on a certain _model architecture_, e.g. LLaMa-2, and include that as a way to...
Hey, thanks for contributing. We're planning a pretty significant overhaul, so this may stop working in the near future, but I'm happy to commit it if anyone would find it...
@angelayi any progress expected, or any workarounds available? Trying to pass an int as a `torch.empty([], dtype=torch.int)` feels like asking for data-dependent errors.
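For context, a minimal sketch of the pattern being questioned above (the helper name is my own; this just shows an int smuggled through as a 0-d tensor, not a recommended fix):

```python
import torch

def wrap_int(x: int) -> torch.Tensor:
    # Hypothetical helper: carry a Python int through the graph as a
    # 0-d ("scalar") tensor instead of a plain Python scalar.
    return torch.tensor(x, dtype=torch.int64)

n = wrap_int(5)
assert n.dim() == 0   # 0-d tensor, same shape as torch.empty([])
assert n.item() == 5  # reading it back; under tracing/export, calls like
                      # .item() are where data-dependent errors tend to surface
```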
Good ones to probably add: const-eval, DCE (dead code elimination).
A good source of glossary terms would probably be pass names.