
TensorRT-LLM Triton Backend Support

Open shixianc opened this issue 1 year ago • 6 comments

When will NAV support creating a Triton model repository for this new backend? Is it on your roadmap? https://github.com/triton-inference-server/tensorrtllm_backend

shixianc avatar Nov 15 '23 19:11 shixianc

@shixianc thanks for the feature request. We are going to review the backend options and add support in the next release.

If there are any specific requirements you see, let us know. Thanks!

jkosek avatar Nov 21 '23 09:11 jkosek

Hi team! Was this ever added? I'm looking through the release notes but cannot find support for TRT-LLM.

ishandhanani avatar Apr 04 '24 18:04 ishandhanani

Hi @ishandhanani. Apologies, not yet. Let us prioritize this feature and provide an ETA.

jkosek avatar Apr 04 '24 22:04 jkosek

@ishandhanani a few questions to clarify the expected behavior. Do you see this feature as generating the model store for the tensorrtllm backend only (example), or would you expect the whole deployment to be created, with pre/post processing via BLS (similar to this example)?

jkosek avatar Apr 04 '24 22:04 jkosek

I think a good first step would be to have it generate the model repo for the trtllm backend only. In the future it would be great if we could generate the entire pre/post processing model repo as well @jkosek
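For context, a minimal sketch of the kind of model store being requested here. The directory layout (a model directory containing `config.pbtxt` and a numbered version subdirectory) follows Triton's standard repository convention; the specific config fields and values below are illustrative placeholders, not the actual output Model Navigator would produce.

```python
import pathlib
import tempfile

def make_trtllm_repo(root: pathlib.Path, model_name: str = "tensorrt_llm") -> pathlib.Path:
    """Sketch: lay out a minimal Triton model repository entry
    for the tensorrtllm backend. Engine files would be placed in
    the version directory; config values here are placeholders."""
    model_dir = root / model_name
    version_dir = model_dir / "1"  # Triton expects numbered version dirs
    version_dir.mkdir(parents=True, exist_ok=True)
    config = (
        f'name: "{model_name}"\n'
        'backend: "tensorrtllm"\n'
        'max_batch_size: 8\n'  # placeholder value
    )
    (model_dir / "config.pbtxt").write_text(config)
    return model_dir

if __name__ == "__main__":
    repo = pathlib.Path(tempfile.mkdtemp())
    model_dir = make_trtllm_repo(repo)
    print(sorted(p.name for p in model_dir.iterdir()))  # ['1', 'config.pbtxt']
```

A full pre/post processing deployment would add sibling model directories (e.g. tokenizer models and a BLS orchestrator) to the same repository root, as in the ensemble example linked above.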

ishandhanani avatar Apr 04 '24 22:04 ishandhanani