
Triton Model Navigator is an inference toolkit designed for optimizing and deploying Deep Learning models with a focus on NVIDIA GPUs.

Results: 17 model_navigator issues

Hi, I am trying to convert the Donut model, which is built on PyTorch Lightning, and it is throwing the following error: **2022-09-27 15:02:35,649 INFO Navigator API: PyTorch to ONNX...

When will Model Navigator support creating a Triton model repository for this new backend? Is it on your roadmap? https://github.com/triton-inference-server/tensorrtllm_backend

enhancement
non-stale

This install command breaks pip's parser: `pip install -U --extra-index-url https://pypi.ngc.nvidia.com triton-model-navigator[]`. This works instead: `pip install -U --extra-index-url https://pypi.ngc.nvidia.com triton-model-navigator[extras]`

enhancement
non-stale

I have 2 separate questions to which I could not yet find answers, so I am posting them here in the hope that someone can answer: 1. When doing TRT conversion from TorchScript to TRT...

enhancement
non-stale

I am trying to follow the tutorial from your repository using the Triton server, but I am encountering the following errors: # ./optimize.py Traceback (most recent call last): File "./examples/01_optimize_torch_linear_model/./optimize.py",...

get_output_metadat**e**_impl -> get_output_metadat**a**_impl