
How do I generate configuration files for a local model?

Open xiaolaimeme opened this issue 3 years ago • 3 comments

As in the title: files such as config.json, tokenizer.json and tokenizer_config.json.

xiaolaimeme avatar Feb 15 '22 02:02 xiaolaimeme

The files need to be provided in one of these ways:

  • --model arg -> tokenizer stuff + model stuff
  • --tokenizer arg -> tokenizer stuff only

The value can be a local path or a Hugging Face hub model ID.

After that, the files are copied to their final destination to be used by the Triton inference server. Does that answer your question?
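For reference, here is a minimal sketch of how the files mentioned above (config.json, tokenizer.json, tokenizer_config.json) can be produced locally with the standard Hugging Face transformers `save_pretrained` API; the model id and output directory below are illustrative examples, not values required by transformer-deploy:

```python
# Minimal sketch, assuming an example hub model and an example output directory.
# save_pretrained is the standard Hugging Face transformers API for writing
# config.json, tokenizer_config.json, tokenizer.json, vocab files, etc. locally.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "roberta-large-mnli"   # example HF hub id
local_dir = "./my-local-model"    # example local path

model = AutoModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

model.save_pretrained(local_dir)      # writes config.json + model weights
tokenizer.save_pretrained(local_dir)  # writes tokenizer_config.json / tokenizer.json / vocab files

# The resulting directory can then be passed as the local path to the
# --model (or --tokenizer) argument instead of a hub id.
```

Check the CLI help of your installed transformer-deploy version for the exact `--model` / `--tokenizer` argument names and defaults.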

pommedeterresautee avatar Feb 15 '22 10:02 pommedeterresautee

This problem has been solved, but thank you for your reply.


xiaolaimeme avatar Feb 16 '22 00:02 xiaolaimeme


Hi @xiaolaimeme, can you please share the code for local model config file generation?

harishprabhala avatar Sep 07 '22 13:09 harishprabhala