[Bug] ValueError: Expect value to be constant int
Expected behavior
```
  File "/tvm/python/tvm/driver/tvmc/compiler.py", line 452, in compile_model
    graph_module = build(
  File "/tvm/python/tvm/driver/tvmc/compiler.py", line 528, in build
    return relay.vm.compile(mod, target=tvm_target, params=params)
  File "/tvm/python/tvm/relay/backend/vm.py", line 67, in compile
    compiler.lower(mod, target, target_host)
  File "/tvm/python/tvm/relay/backend/vm.py", line 126, in lower
    self._lower(mod, raw_targets)
  File "/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 239, in __call__
    raise_last_ffi_error()
  File "/tvm/python/tvm/_ffi/base.py", line 481, in raise_last_ffi_error
    raise py_err
  File "/tvm/python/tvm/relay/op/strategy/generic.py", line 51, in wrapper
    return topi_schedule(outs)
  File "/tvm/python/tvm/autotvm/task/topi_integration.py", line 242, in wrapper
    return topi_schedule(cfg, outs, *args, **kwargs)
  File "/tvm/python/tvm/topi/cuda/batch_matmul.py", line 180, in schedule_batch_matmul
    traverse_inline(s, outs[0].op, _callback)
  File "/tvm/python/tvm/topi/utils.py", line 81, in traverse_inline
    _traverse(final_op)
  File "/tvm/python/tvm/topi/utils.py", line 79, in _traverse
    callback(op)
  File "/tvm/python/tvm/topi/cuda/batch_matmul.py", line 178, in _callback
    _schedule(cfg, op)
  File "/tvm/python/tvm/topi/cuda/batch_matmul.py", line 111, in _schedule
    cfg.define_split("tile_y", y, num_outputs=3)
  File "/tvm/python/tvm/autotvm/task/space.py", line 736, in define_split
    return self._add_new_transform(SplitSpace, name, axes, policy, **kwargs)
  File "/tvm/python/tvm/autotvm/task/space.py", line 1127, in _add_new_transform
    axes = [x if isinstance(x, (VirtualAxis, Axis)) else self.axis(x) for x in axes]
  File "/tvm/python/tvm/autotvm/task/space.py", line 1127, in <listcomp>
```
Environment
The t5 model and the newest TVM.

This looks model-dependent: it occurs when there is a dynamic parameter in the model, which the static CUDA `batch_matmul` schedule cannot split into constant tiles.
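The failure mode can be illustrated with a minimal, TVM-free sketch: `define_split` ultimately needs a concrete integer extent for the axis it tiles, and a symbolic (dynamic) dimension fails that check. Here `Var` and `get_const_int` are simplified stand-ins for `tvm.tir.Var` and TVM's internal constant-int check, not the real APIs.

```python
class Var:
    """Simplified stand-in for a symbolic shape variable (like tvm.tir.Var)."""
    def __init__(self, name):
        self.name = name
    def __repr__(self):
        return self.name

def get_const_int(exp):
    """Stand-in for the internal check that raises in autotvm's space.py."""
    if isinstance(exp, int):
        return exp
    raise ValueError("Expect value to be constant int")

static_shape = (8, 128, 64)               # every dim is a concrete int
dynamic_shape = (8, Var("seq_len"), 64)   # one dim is symbolic

print(get_const_int(static_shape[1]))     # a constant dim passes the check
try:
    get_const_int(dynamic_shape[1])       # a symbolic dim hits the error above
except ValueError as e:
    print(e)
```

In the real failure, the symbolic dimension comes from the model (e.g. a variable sequence length), so the static schedule aborts before compilation finishes.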
- Add more information to the error raised. This could be done by updating the `raise RuntimeError` to have the message `f"Expect value to be constant int, but expression {exp} was of type {type(exp)}"`. This would help identify which dynamic parameter in the model is causing the issue.
- Try using `tvm.dlight` for scheduling. This is a set of schedules for dynamic-shaped inputs. If your model cannot avoid having the dynamic parameter, these may be useful instead of the static schedules.