🐛 [Bug] Resize is currently not support in dynamic input shape compilation
Bug Description
I built Torch-TensorRT myself from the latest master on 2022-04-05. Everything works until I run the compiled module; the log is:
WARNING: [Torch-TensorRT] - Dilation not used in Max pooling converter
WARNING: [Torch-TensorRT] - There may be undefined behavior using dynamic shape and aten::size
terminate called after throwing an instance of 'torch_tensorrt::Error'
what(): [Error thrown at core/conversion/converters/impl/shuffle.cpp:47] Resize is currently not support in dynamic input shape compilation
To Reproduce
Currently I can't share a reproduction; I may provide a dynamic-shape sample later.
Expected behavior
The module compiles and runs correctly with dynamic input shapes.
Environment
- Torch-TensorRT Version: latest master as of 2022-04-05
- TensorRT: 8.2.2.1
- PyTorch Version (e.g. 1.10.0): 1.10.0
- CPU Architecture: x86_64
- OS (e.g., Linux): Ubuntu 18.04
- How you installed PyTorch (conda, pip, libtorch, source): conda with pip
- Build command you used (if compiling from source): `bazel build //:libtorchtrt --compilation_mode opt`
- Are you using local sources or building from archives: local sources (libtorch-1.10.0 and TensorRT-8.2.2.1)
- Python version: 3.6
- CUDA version: 11.1
- GPU models and configuration:
- Any other relevant information:
Additional context
Torch-TensorRT usage:
```cpp
std::vector<int64_t> minOps {1, dims.d[1], dims.d[2], dims.d[3]};
std::vector<int64_t> optOps {2, dims.d[1], dims.d[2], dims.d[3]};
std::vector<int64_t> maxOps {mParams->in.maxBatchSize, dims.d[1], dims.d[2], dims.d[3]};
auto input = torch_tensorrt::Input(minOps, optOps, maxOps,
                                   mParams->in.fp16 ? torch::kFloat16 : torch::kFloat32);
auto compile_settings = torch_tensorrt::ts::CompileSpec({input});
// FP16
compile_settings.enabled_precisions = {mParams->in.fp16 ? torch::kFloat16 : torch::kFloat32};
...
```
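For reference, here is a minimal sketch (hypothetical; not the actual model from this report) of the pattern that typically triggers this error together with the `aten::size` warning: a reshape whose target shape is computed from the input's runtime size, which becomes a dynamic `aten::view`/Resize that the shuffle converter rejects under dynamic input shapes.

```python
import torch

# Hypothetical module (assumption, not from the original report):
# flattening via x.size(0) puts aten::size into the TorchScript graph,
# and the following view() becomes a shape-dependent reshape.
class Flatten(torch.nn.Module):
    def forward(self, x):
        # Dynamic reshape: target shape depends on the runtime batch size.
        return x.view(x.size(0), -1)

scripted = torch.jit.script(Flatten())
out = scripted(torch.randn(2, 3, 4, 4))
print(out.shape)  # torch.Size([2, 48])
```

Compiling such a scripted module with a `torch_tensorrt::Input` whose min/opt/max shapes differ (as in the snippet above) would exercise the dynamic-shape path in `core/conversion/converters/impl/shuffle.cpp`.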
Could anyone help me?
@xsacha @aaronp24 @itsliupeng @lukeyeager
This issue has not seen activity for 90 days, Remove stale label or comment or this will be closed in 10 days