Prakalp Srivastava
That could be an interesting direction @masahi! Clarification question: If we materialize the layout in such cases, but are not able to raise the transformed PrimFunc back to operator level,...
Hello @quic-sanirudh! > Are there any plans to add support for extracting op info as mentioned here at some point? Yes, extracting op info would be supported. As mentioned in...
Looking forward to the example. However, the use case that we have (graph operator level layout transformation) does not need to preserve this information in the presence of fusion because graph...
Thank you everyone for the discussion yesterday and for bringing up very important points. I'll summarize them below. **Q1. [@Hzfengsy] Would layout planning interfere with fusion?** For the use cases we have...
@spectrometerHBH we are implementing this. We will be sending out PRs starting next week for review.
@masahi that would be awesome! @sunggg also proposed a direction [here](https://github.com/tlc-pack/relax/issues/278). Would be great to get alignment on the approach. > UPDATE: just found that the proposal talks about relax.layout_transform...
First, it is Hexagon-specific. On CPU the tuned kernel output is the same as the untuned output. Second, I have only observed this behavior for this specific kernel. For example, after...
~In addition to that, this is definitely some incorrect transformation of untuned PrimFunc, as the two PrimFuncs shown above give different results even on CPU.~
I think I have narrowed it down to the reordering of loops. On Hexagon the following two modules which differ only in the order of loops i3 & i4 produce...
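Since the two PrimFuncs from the post are not shown here, the following is a hypothetical, plain-Python illustration of why diverging results point at a codegen bug: interchanging two independent loops (named `i3` and `i4` to mirror the discussion) cannot change the result of an integer accumulation, so any difference must come from the backend.

```python
# Hypothetical sketch (not the actual PrimFuncs): two accumulations that
# differ only in the order of loops i3 and i4. With integer arithmetic the
# results must match exactly, regardless of loop order.
def compute_i3_then_i4(a, b):
    acc = 0
    for i3 in range(len(a)):
        for i4 in range(len(b)):
            acc += a[i3] * b[i4]
    return acc

def compute_i4_then_i3(a, b):
    acc = 0
    for i4 in range(len(b)):
        for i3 in range(len(a)):
            acc += a[i3] * b[i4]
    return acc

a = [1, 2, 3, 4]
b = [5, 6, 7]
assert compute_i3_then_i4(a, b) == compute_i4_then_i3(a, b)
```

(With floating-point reductions the accumulation order can legitimately change the result slightly, but that would not explain a wrong answer on only one target.)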
The well_formed pass should also check whether checked_type is defined, but it seems it doesn't right now.
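To make the intent concrete, here is a minimal sketch of such a check. The node class and field names (`Expr`, `children`, `checked_type`) are illustrative stand-ins, not the real Relax IR classes; the actual well_formed pass lives in the TVM codebase and uses its own visitor infrastructure.

```python
# Hypothetical IR node; `checked_type is None` stands for "not yet inferred".
class Expr:
    def __init__(self, name, children=(), checked_type=None):
        self.name = name
        self.children = list(children)
        self.checked_type = checked_type

def well_formed(expr):
    """Return False if any node in the expression tree lacks a checked_type."""
    if expr.checked_type is None:
        return False
    return all(well_formed(c) for c in expr.children)

x = Expr("x", checked_type="Tensor")
ok = Expr("add", [x, x], checked_type="Tensor")
bad = Expr("mul", [x, Expr("y")], checked_type="Tensor")  # "y" is untyped
assert well_formed(ok)
assert not well_formed(bad)
```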