TensorComprehensions
Unable to use same variable for reduction twice or more in TC
Do we not allow people to use the same variable twice? A repro is also available in #61.
FAILS:
def softmax(float(N, D) I) -> (O, tmp) {
tmp(n) max=! I(n, d)
O(n, d) = exp(I(n, d) - tmp(n))
tmp(n) +=! O(n, d)
O(n, d) = O(n, d) / tmp(n)
}
PASSES:
def softmax(float(N, D) I) -> (O, expsum, maxVal) {
maxVal(n) max= I(n, d)
expsum(n) +=! exp(I(n, d) - maxVal(n))
O(n, d) = exp(I(n, d) - maxVal(n)) / expsum(n)
}
cc @jekbradbury
The reason is that we use the Halide IR, which assumes the tensor dependencies form a strict DAG. The failing softmax variant writes `tmp` twice (first as the row max, then as the exp-sum) and `O` twice, so it violates that assumption; the passing variant gives each tensor a single definition.
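For reference, the two variants are mathematically equivalent; only the reuse of `tmp` and `O` differs. A minimal NumPy sketch (an assumption — this runs outside TC, with reassignment modeling the in-place updates) showing both compute the same numerically stable softmax:

```python
import numpy as np

def softmax_single_assignment(I):
    # Mirrors the PASSES variant: each tensor is defined exactly once.
    maxVal = I.max(axis=1, keepdims=True)            # maxVal(n) max= I(n, d)
    expsum = np.exp(I - maxVal).sum(axis=1, keepdims=True)  # expsum(n) +=! ...
    return np.exp(I - maxVal) / expsum               # O(n, d) = ... / expsum(n)

def softmax_reused_temps(I):
    # Mirrors the FAILS variant: tmp and O are each written twice,
    # which is what breaks the strict-DAG assumption in TC's Halide IR.
    tmp = I.max(axis=1, keepdims=True)   # tmp(n) max=! I(n, d)
    O = np.exp(I - tmp)                  # O(n, d) = exp(I(n, d) - tmp(n))
    tmp = O.sum(axis=1, keepdims=True)   # tmp(n) +=! O(n, d)  (tmp redefined)
    return O / tmp                       # O(n, d) = O(n, d) / tmp(n)

I = np.random.randn(4, 8)
assert np.allclose(softmax_single_assignment(I), softmax_reused_temps(I))
assert np.allclose(softmax_single_assignment(I).sum(axis=1), 1.0)
```

In plain Python the reuse is harmless because assignment just rebinds names; in TC each statement defines a tensor node, so a second write to `tmp` or `O` creates a cycle-like dependency the IR cannot represent.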