Rajeev Rao
@jiapei100 in the meantime, you can build and use the onnx-tensorrt parser (library only, not the `onnx2trt` executable) for ONNX rel-1.10.0 by specifying `-DBUILD_LIBRARY_ONLY=1` when configuring with CMake.
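For reference, a library-only build usually looks something like the sketch below; the TensorRT path is a placeholder for your local installation, not an exact prescription.

```
git clone --recursive https://github.com/onnx/onnx-tensorrt.git
cd onnx-tensorrt
mkdir build && cd build
# Library-only configuration; skips building the onnx2trt executable.
cmake .. -DBUILD_LIBRARY_ONLY=1 -DTENSORRT_ROOT=/path/to/TensorRT
make -j
```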
@SEHAIRIKamal could you please check if using TensorFlow 2.5 (as prescribed in the README) fixes the issue? cc @shuyuelan
@tanayvarshney let's follow up internally if this is needed. Closing the PR.
TopK should support int32 inputs since TensorRT 8.5. @nicolasgorrity @scuizhibin please check and let us know if you are still facing issues. Thanks.
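If it helps to verify locally, here is a rough sketch using the TensorRT Python API (assumes TensorRT >= 8.5; the shapes and K value are arbitrary) that builds a one-layer network with an INT32 input feeding TopK:

```
import tensorrt as trt

# Rough sketch, not an official sample: a one-layer network with an INT32
# input feeding TopK, to check that the layer accepts int32 inputs.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))

scores = network.add_input("scores", trt.int32, (1, 64))            # INT32 input tensor
topk = network.add_topk(scores, trt.TopKOperation.MAX, 8, 1 << 1)   # top-8 along axis 1 (axes is a bitmask)
network.mark_output(topk.get_output(0))   # values
network.mark_output(topk.get_output(1))   # indices

config = builder.create_builder_config()
engine = builder.build_serialized_network(network, config)
print("engine built" if engine is not None else "build failed")
```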
@patrickvonplaten @sayakpaul please review.
> > It should reside in optimum. Cc: @echarlaix
>
> You mean patching the model in optimum?

Depending on the modifications needed, it could make sense to have...
> Hi @rajeevsrao, could you share the script you used for the export?

Here is the ONNX export script for reference:

```
from diffusers.models import UNetSpatioTemporalConditionModel
import torch

model_name...
```
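(The quoted script is truncated above. As a rough illustration only, and not the original script, an export of `UNetSpatioTemporalConditionModel` along these lines might look like the sketch below; the checkpoint name, dummy shapes, and opset are assumptions.)

```
import torch
from diffusers.models import UNetSpatioTemporalConditionModel

# Illustrative sketch only -- not the script referenced above.
# The checkpoint name, dummy shapes, and opset are assumptions.
unet = UNetSpatioTemporalConditionModel.from_pretrained(
    "stabilityai/stable-video-diffusion-img2vid", subfolder="unet"
).eval()

class UNetWrapper(torch.nn.Module):
    """Return a plain tensor so torch.onnx.export does not trace a ModelOutput."""
    def __init__(self, unet):
        super().__init__()
        self.unet = unet

    def forward(self, sample, timestep, encoder_hidden_states, added_time_ids):
        return self.unet(sample, timestep, encoder_hidden_states,
                         added_time_ids, return_dict=False)[0]

batch, frames, height, width = 1, 2, 64, 64   # assumed dummy latent sizes
sample = torch.randn(batch, frames, unet.config.in_channels, height, width)
timestep = torch.tensor(999)
encoder_hidden_states = torch.randn(batch, 1, unet.config.cross_attention_dim)
added_time_ids = torch.randn(batch, 3)        # fps, motion bucket id, noise aug

with torch.no_grad():
    torch.onnx.export(
        UNetWrapper(unet),
        (sample, timestep, encoder_hidden_states, added_time_ids),
        "unet_spatio_temporal.onnx",
        input_names=["sample", "timestep", "encoder_hidden_states", "added_time_ids"],
        output_names=["out_sample"],
        opset_version=17,
    )
```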
@sayakpaul @echarlaix please suggest next steps. Thanks.