[Bug] [Shape & Type] No error for incompatible shape and type between LHS and RHS of binding
I expected the following module to throw a compilation error, but there doesn't seem to be any static shape or type checking between the LHS and RHS of a binding.
```python
import tvm
from tvm import relax
from tvm.script import relax as R


def test_check_binding():
    @tvm.script.ir_module
    class InputMod:
        @R.function
        def f(x: Tensor((2, 3), "float32")):
            with R.dataflow():
                # The var's shape annotation has incorrect ndim, dtype, and shape,
                # but no error is thrown.
                z: Tensor((3, 2, 5), "int64") = x
                R.output(z)
            return z

    InputMod.show()
    print(InputMod["f"].body.blocks[0].bindings[-1].var.checked_type)    # Tensor[ndim=3, "int64"]
    print(InputMod["f"].body.blocks[0].bindings[-1].value.checked_type)  # Tensor[ndim=2, "float32"]
    print("Well Formed:", relax.analysis.well_formed(InputMod))          # Well Formed: True
```
Right, in this case, we should throw an error when parsing. cc @yongwww
We need to decide whether we want explicit type/shape casting (for example, through match_shape) or a sugar on the frontend where the parser inserts match_shape. Related thread: https://github.com/tlc-pack/relax/issues/222
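To make the sugar option concrete, here is a minimal, library-agnostic sketch (not TVM code; `match_shape` here is just a placeholder name for the inserted check): the parser would rewrite an annotated binding `z: Tensor(shape, dtype) = x` into an explicit check on the RHS followed by the binding itself.

```python
# Hypothetical desugaring sketch. An annotated binding becomes a
# match_shape-style check plus a binding to the checked value.
def desugar_binding(var_name, ann_shape, ann_dtype, rhs_name):
    # Emit the rewritten binding as source lines, for illustration only.
    return [
        f"{rhs_name}_checked = match_shape({rhs_name}, {ann_shape}, {ann_dtype!r})",
        f"{var_name} = {rhs_name}_checked",
    ]

# The binding from the repro above:
for line in desugar_binding("z", (3, 2, 5), "int64", "x"):
    print(line)
```

With the explicit-casting option, the user would write the `match_shape` call themselves and a bare annotated binding with mismatched types would simply be rejected.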
I think dtype and rank mismatches should be caught at compile time, even if the shape itself may need a dynamic check.
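The split is easy to see in a sketch: rank and dtype of both sides are statically known from the annotation and the inferred type, so those mismatches never need a runtime check (this is a hypothetical helper, not the actual TVM checker):

```python
# Hypothetical static check, not TVM code: reject a binding whose
# annotated rank/dtype disagree with the inferred type of the RHS.
def binding_ok(lhs_ndim, lhs_dtype, rhs_ndim, rhs_dtype):
    return lhs_ndim == rhs_ndim and lhs_dtype == rhs_dtype

# The binding from the repro: annotation is 3-D int64, value is 2-D float32.
print(binding_ok(3, "int64", 2, "float32"))  # False -> reject at compile time
```

Only the concrete dimension values (e.g. symbolic or unknown extents) would fall through to a dynamic check such as match_shape.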
For dtype casting, the real question is whether the conversion should be implicit or explicit. I think NumPy and some other tensor libraries complain about dtype mismatches rather than converting implicitly.
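For reference, NumPy's behavior depends on the casting rule in effect: under the strict `casting="no"` rule a mismatched-dtype operation is rejected outright, while the default promotion rules convert implicitly. A small demonstration:

```python
import numpy as np

a = np.zeros(3, dtype=np.float32)
b = np.zeros(3, dtype=np.int64)

# Under the strict casting rule, NumPy refuses to combine mismatched dtypes.
try:
    np.add(a, b, casting="no")
except TypeError as e:
    print("rejected:", e)

# Under the default promotion rules, the conversion happens implicitly.
print((a + b).dtype)  # float64
```

If Relax goes the explicit route, a dtype mismatch in a binding would behave like the strict case: a hard error unless the user inserts a cast.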