pranavm-nvidia

29 comments by pranavm-nvidia

You can install it from the NVIDIA PyPI as [mentioned in the README](https://github.com/NVIDIA/TensorRT/tree/main/tools/onnx-graphsurgeon#using-prebuilt-wheels):

> python3 -m pip install onnx_graphsurgeon --index-url https://pypi.ngc.nvidia.com

Does a URL to the wheel work? https://developer.download.nvidia.com/compute/redist/onnx-graphsurgeon/onnx_graphsurgeon-0.3.20-py2.py3-none-any.whl

You can use `slice.set_input(index, tensor)` for a dynamic slice. See [the documentation](https://docs.nvidia.com/deeplearning/tensorrt/api/python_api/infer/Graph/Layers.html?highlight=islicelayer#tensorrt.ISliceLayer.set_input) for details. To create the shape, you should be able to use an `IShapeLayer` in combination with layers like...

I wrote up a short example, let me know if it helps:

```py
#!/usr/bin/env python3
# Generation Command: polygraphy template trt-network -o use_dynamic_slice.py
import numpy as np
import tensorrt as...
```

Weird, maybe you missed the last two lines in the script?

```py
def load_data():
    return [{"input": np.arange(1 * 3 * 4 * 4, dtype=np.float32).reshape(1, 3, 4, 4)}]
```

Otherwise you...
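For context, a Polygraphy data-loader script is just a Python file defining a `load_data()` function that returns (or yields) feed dicts mapping input tensor names to host arrays. A minimal self-contained sketch, where the tensor name `"input"` and the `1x3x4x4` shape are assumptions for illustration:

```python
# Minimal sketch of a Polygraphy-style data-loader script.
# The tensor name "input" and the 1x3x4x4 shape are assumptions;
# match them to your model's actual input names and shapes.
import numpy as np


def load_data():
    # Each element is one feed_dict: {input tensor name: host array}.
    # Returning a list of one dict means one inference iteration.
    return [{"input": np.arange(1 * 3 * 4 * 4, dtype=np.float32).reshape(1, 3, 4, 4)}]
```

Polygraphy can then pick the script up with something like `polygraphy run model.onnx --trt --data-loader-script data_loader.py` (exact invocation depends on your workflow).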

Looks like you're referring to the 8.4.2 docs. Can you try upgrading your TensorRT version?

Unfortunately, this API wasn't exposed in TRT 7, so I don't think there's much you can do short of upgrading.

The logger output would go directly to stdout/stderr, so you'd have to capture it from there. What are you looking to do exactly? Is post-processing the logs an option?
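Because the logger writes from native code, reassigning `sys.stderr` in Python is not enough; you have to capture at the file-descriptor level. A minimal sketch of that technique, independent of TensorRT (the helper name and the sample message are my own):

```python
# Capture everything written to the stderr file descriptor (fd 2),
# including writes from native libraries that bypass sys.stderr.
import os
import sys


def capture_stderr_fd(func):
    """Run func() and return (its result, everything written to fd 2).

    Note: a pipe buffer is finite (~64 KiB on Linux), so very chatty
    output could block; for long runs, drain the pipe from a thread.
    """
    read_fd, write_fd = os.pipe()
    saved_fd = os.dup(2)           # keep a copy of the real stderr
    try:
        os.dup2(write_fd, 2)       # point fd 2 at our pipe
        result = func()
        sys.stderr.flush()
    finally:
        os.dup2(saved_fd, 2)       # restore the real stderr
        os.close(saved_fd)
        os.close(write_fd)         # close last write end -> EOF for reader
    chunks = []
    while True:
        chunk = os.read(read_fd, 4096)
        if not chunk:
            break
        chunks.append(chunk)
    os.close(read_fd)
    return result, b"".join(chunks).decode()
```

With this in place, post-processing is just string handling on the captured text.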

We do plan to support it in the next release.