
Docs for --triton-launch-mode=c_api and local missing


Hi there,

I'm trying to use the model-analyzer in local mode or with c_api, but when I navigate to the links for more information I either get a 404 Page Not Found or find the information missing. Could you please update the docs?

Launch mode docs - https://github.com/triton-inference-server/model_analyzer/blob/main/docs/launch_modes.md#c-api
Broken c_api link - https://github.com/triton-inference-server/server/blob/main/docs/inference_protocols.md#c-api

Local mode points to the quick start, but the information there is missing - https://github.com/triton-inference-server/model_analyzer/blob/main/docs/quick_start.md
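
For context, the sort of command I'm trying to get working looks roughly like this (model name and repository path are placeholders):

```sh
# Placeholder model name and repository path; run Model Analyzer without Docker,
# letting it start tritonserver on the local machine
model-analyzer profile \
    --model-repository /path/to/model_repository \
    --profile-models my_model \
    --triton-launch-mode local
```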

Thank you!

aishanibhalla avatar Sep 13 '22 19:09 aishanibhalla

Hi, thanks for pointing this out. The docs have been updated. Let me know if there is anything that is unclear.

tgerdesnv avatar Sep 15 '22 00:09 tgerdesnv

Thank you for updating, @tgerdesnv!

I have a question regarding the model-analyzer: is it possible to package model-analyzer and tritonclient in one Docker container and run the model-analyzer from there? It's not quite evident from the documentation how I can do that.

aishanibhalla avatar Oct 07 '22 01:10 aishanibhalla

https://github.com/triton-inference-server/model_analyzer/blob/main/docs/install.md#specific-version-with-local-launch-mode

That will build a container, based on the Triton Server container, that includes Model Analyzer and Perf Analyzer.
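
Roughly, following that doc, the build and single-container run might look like the sketch below (the branch tag, paths, and model repository name are placeholders, not copied from the doc):

```sh
# Build a Model Analyzer image on top of the matching Triton containers;
# the Dockerfile is at the repo root, and the branch tag is a placeholder.
git clone https://github.com/triton-inference-server/model_analyzer.git -b r22.09
cd model_analyzer
docker build --pull -t model-analyzer .

# Start an interactive session in that one container, mounting a model
# repository from the host (paths are placeholders).
docker run -it --rm --gpus all \
    -v /path/to/model_repository:/models \
    --net host model-analyzer
```

Model Analyzer itself depends on tritonclient, so the client library should be available inside that same image; a separate tritonclient container shouldn't be needed.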

tgerdesnv avatar Oct 07 '22 14:10 tgerdesnv

Thank you, @tgerdesnv! Another follow-up question: does the model-analyzer require a default config.pbtxt, or can it still generate the possible configs if I supply a model without one?

aishanibhalla avatar Oct 14 '22 23:10 aishanibhalla

Triton can auto-generate a default configuration for you in some cases (depending on the backend). If you don't supply a config, Model Analyzer will attempt to have Triton generate one.

https://github.com/triton-inference-server/server/blob/a2240c56282e7928d430d7260c860875f0e698a1/docs/user_guide/model_configuration.md#auto-generated-model-configuration
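
As a sketch of what that covers: for a backend that supports auto-complete (e.g. ONNX Runtime), a hand-written config.pbtxt like the hypothetical one below can be trimmed down, or omitted entirely, because Triton can derive max_batch_size, input, and output from the model file itself (the name, backend, and tensor shapes here are made up for illustration):

```sh
# Hypothetical minimal config.pbtxt for an ONNX model; with auto-generated
# configuration, Triton can fill in these fields from the model itself.
mkdir -p model_repository/my_model/1
cat > model_repository/my_model/config.pbtxt <<'EOF'
name: "my_model"
backend: "onnxruntime"
max_batch_size: 8
input [
  { name: "input__0", data_type: TYPE_FP32, dims: [ 3, 224, 224 ] }
]
output [
  { name: "output__0", data_type: TYPE_FP32, dims: [ 1000 ] }
]
EOF
```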

tgerdesnv avatar Oct 17 '22 15:10 tgerdesnv