model_navigator
TensorRT-LLM Triton Backend Support
When can NAV support creating a Triton model repository for this new backend? Is it on your roadmap? https://github.com/triton-inference-server/tensorrtllm_backend
@shixianc thanks for the feature request. We are going to review the backend options and add support in the next release.
If there are any specific requirements you see, let us know. Thanks!
Hi team! Was this ever added? I'm looking through the release notes but cannot find support for TRT-LLM.
Hi @ishandhanani. Apologies, not yet. Let us prioritize this feature and provide an ETA.
@ishandhanani a few questions to clarify the expected behavior. Do you see this feature as generating the model store for the tensorrtllm backend only (example), or would you expect the whole deployment, with pre/post-processing via BLS, to be created (similar to this example)?
I think a good first step would be to have it generate the model repo for the trtllm backend only. In the future it would be great if we could generate the entire pre/post-processing model repo as well @jkosek
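For context on what "generating the model repo for the trtllm backend only" would entail: a Triton model repository is a directory tree with one folder per model, each containing numbered version subdirectories and a `config.pbtxt`. The sketch below is illustrative only; the model name, version number, and `max_batch_size` value are assumptions, not output from Model Navigator or the exact schema required by the tensorrtllm backend.

```
model_repository/
└── tensorrt_llm/            # model name (illustrative)
    ├── 1/                   # version directory holding the compiled TRT-LLM engine files
    └── config.pbtxt

# Minimal config.pbtxt fragment (values are placeholders):
#   name: "tensorrt_llm"
#   backend: "tensorrtllm"
#   max_batch_size: 8
```

A tool generating this layout would only need to emit the directory skeleton and a valid `config.pbtxt`, leaving the pre/post-processing BLS ensemble from the tensorrtllm_backend example as a separate, later step.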