TensorComprehensions

Unable to use same variable for reduction twice or more in TC

Open prigoyal opened this issue 7 years ago • 2 comments

Do we not allow using the same variable for more than one reduction? A repro is also available in #61.

FAILS:
def softmax(float(N, D) I) -> (O, tmp) {
        tmp(n) max=! I(n, d)
        O(n, d) = exp(I(n, d) - tmp(n))
        tmp(n) +=! O(n, d)
        O(n, d) = O(n, d) / tmp(n)
}

PASSES:
def softmax(float(N, D) I) -> (O, expsum, maxVal) {
        maxVal(n) max= I(n, d)
        expsum(n) +=! exp(I(n, d) - maxVal(n))
        O(n, d) = exp(I(n, d) - maxVal(n)) / expsum(n)
}
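For reference, the passing three-output formulation corresponds to the usual numerically stable softmax. A NumPy sketch (tensor names mirror the TC above):

```python
import numpy as np

def softmax(I):
    """Numerically stable softmax over d, mirroring the passing TC."""
    maxVal = I.max(axis=1, keepdims=True)                   # maxVal(n) max= I(n, d)
    expsum = np.exp(I - maxVal).sum(axis=1, keepdims=True)  # expsum(n) +=! exp(I(n, d) - maxVal(n))
    return np.exp(I - maxVal) / expsum                      # O(n, d) = exp(...) / expsum(n)

I = np.random.randn(4, 8)
O = softmax(I)
print(np.allclose(O.sum(axis=1), 1.0))  # each row sums to 1
```

Each tensor (maxVal, expsum, O) is written exactly once, which is what makes this variant acceptable.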

prigoyal avatar Feb 23 '18 15:02 prigoyal

cc @jekbradbury

prigoyal avatar Feb 23 '18 15:02 prigoyal

The reason is that we use Halide IR, which assumes the tensor dependencies form a strict DAG. The failing softmax variant defines tmp twice (once as a max over I, then as a sum over O, which itself was computed from the first tmp), so its dependencies are not a DAG and it is rejected.
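A rough illustration of the constraint (a sketch, not TC/Halide internals): if we model each TC definition as "output depends on inputs", reusing tmp for both reductions puts tmp and O on a cycle, while the three-output version stays acyclic.

```python
# Dependence graphs for the two kernels, modelled as output -> set of inputs.
fails = {                     # def softmax(...) -> (O, tmp)
    "tmp": {"I", "O"},        # tmp defined twice: max over I, then sum over O
    "O":   {"I", "tmp"},      # O reads tmp both times it is written
}
passes = {                    # def softmax(...) -> (O, expsum, maxVal)
    "maxVal": {"I"},
    "expsum": {"I", "maxVal"},
    "O":      {"I", "maxVal", "expsum"},
}

def has_cycle(graph):
    """Cycle detection via DFS with white/grey/black colouring."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {v: WHITE for v in graph}
    def visit(v):
        color[v] = GREY
        for w in graph.get(v, ()):
            if color.get(w, BLACK) == GREY:      # back edge: cycle found
                return True
            if color.get(w, BLACK) == WHITE and visit(w):
                return True
        color[v] = BLACK
        return False
    return any(visit(v) for v in graph if color[v] == WHITE)

print(has_cycle(fails))   # True: tmp <-> O
print(has_cycle(passes))  # False: a strict DAG
```

In SSA-like terms, the passing version gives each reduction its own name, which is exactly what the DAG requirement forces you to do.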

prigoyal avatar Feb 23 '18 15:02 prigoyal