Docs for --triton-launch-mode=c_api and local missing
Hi there,
I'm trying to use the model-analyzer in local mode or with c_api, but when I try to navigate to the links for more information, I either get a 404 page not found or the information is missing. Could you please update the docs?
Launch mode docs - https://github.com/triton-inference-server/model_analyzer/blob/main/docs/launch_modes.md#c-api
Broken c_api link - https://github.com/triton-inference-server/server/blob/main/docs/inference_protocols.md#c-api
Local mode points to the quickstart but information for that is missing - https://github.com/triton-inference-server/model_analyzer/blob/main/docs/quick_start.md
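For reference, the command I'm trying to run looks roughly like the sketch below. The repository path and model name are placeholders, and I'm only guessing at whether c_api needs any extra flags, which is exactly what I was hoping the docs would clarify.

```bash
# Local mode: Model Analyzer launches a locally installed tritonserver binary
model-analyzer profile \
    --model-repository /path/to/model_repository \
    --profile-models my_model \
    --triton-launch-mode=local

# C API mode: Model Analyzer drives Triton through its in-process C API instead
model-analyzer profile \
    --model-repository /path/to/model_repository \
    --profile-models my_model \
    --triton-launch-mode=c_api
```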
Thank you!
Hi, thanks for pointing this out. The docs have been updated. Let me know if there is anything that is unclear.
Thank you for updating. @tgerdesnv
I have a question regarding the model-analyzer. Is it possible to package model-analyzer and tritonclient in one Docker container and run the model-analyzer? It's not quite evident from the documentation how I can do that.
https://github.com/triton-inference-server/model_analyzer/blob/main/docs/install.md#specific-version-with-local-launch-mode
That will build a container based on the Triton Server container, with both Model Analyzer and Perf Analyzer included.
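Roughly, the flow from that section looks like the sketch below. The release branch, image tag, and paths here are placeholders, so use the values from the install doc that match your Triton Server version.

```bash
# Get the Model Analyzer source; <rYY.MM> is a placeholder for the release
# branch that matches your Triton Server version
git clone https://github.com/triton-inference-server/model_analyzer.git -b <rYY.MM>
cd model_analyzer

# Build an image on top of the Triton Server container; the result has
# tritonserver, perf_analyzer, and model-analyzer packaged together
docker build --pull -t model-analyzer .

# Run the container with your model repository mounted, then invoke
# model-analyzer profile ... --triton-launch-mode=local from inside it
docker run -it --rm --gpus all \
    -v /path/to/model_repository:/models \
    model-analyzer
```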
Thank you! @tgerdesnv
Another follow-up question: does the model-analyzer require a default config.pbtxt, or if I supply a model without one, can it still generate the possible configs?
Triton can auto-generate a default configuration for you in some cases (depending on the backend). If you don't specify a default config, Model Analyzer will attempt to have Triton generate one.
https://github.com/triton-inference-server/server/blob/a2240c56282e7928d430d7260c860875f0e698a1/docs/user_guide/model_configuration.md#auto-generated-model-configuration
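For illustration, this is roughly what a complete config might contain. The model name, platform, tensor names, datatypes, and dims below are made up; with auto-complete, backends such as ONNX Runtime or TensorFlow SavedModel can often fill these fields in for you, so the file can be minimal or omitted entirely.

```
# Hypothetical config.pbtxt for an ONNX model; names, dims, and datatypes
# depend entirely on the actual model.
name: "my_onnx_model"
platform: "onnxruntime_onnx"
max_batch_size: 4
input [
  {
    name: "input__0"
    data_type: TYPE_FP32
    dims: [ 3, 224, 224 ]
  }
]
output [
  {
    name: "output__0"
    data_type: TYPE_FP32
    dims: [ 1000 ]
  }
]
```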