Binyang Li
According to the OpenAI docs (https://platform.openai.com/docs/api-reference/chat/create#chat-create-stream_options), the API provides a `stream_options` parameter that returns token usage info for streaming requests. Please support this option for better rate-limit control.
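Per the linked API reference, setting `stream_options.include_usage` makes the server append one final streamed chunk whose `choices` list is empty and whose `usage` field carries the token counts. A minimal request-body sketch (the model name is illustrative, not a recommendation):

```python
import json

# Request body for POST https://api.openai.com/v1/chat/completions
payload = {
    "model": "gpt-4o-mini",  # hypothetical model choice
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
    # With include_usage set, the final chunk reports
    # prompt_tokens / completion_tokens / total_tokens in `usage`;
    # all earlier chunks carry `usage: null`.
    "stream_options": {"include_usage": True},
}

print(json.dumps(payload))
```

A consumer of the stream can then read `chunk.usage` from the last chunk and feed those counts into the rate limiter.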
When running large-scale jobs, cloning code can fail with `requested URL returned error: 429`. The git plugin needs to be enhanced to handle this case so that the task always retries.
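A minimal retry sketch for this case, assuming a Python-based plugin; the function names (`clone_with_retry`, `is_rate_limited`) are illustrative, not the plugin's actual API:

```python
import subprocess
import time


def is_rate_limited(stderr: str) -> bool:
    """True when git's stderr indicates an HTTP 429 from the remote."""
    return "requested URL returned error: 429" in stderr


def clone_with_retry(repo_url: str, dest: str, max_attempts: int = 5) -> None:
    """Retry `git clone` with exponential backoff on server rate limiting."""
    for attempt in range(1, max_attempts + 1):
        result = subprocess.run(
            ["git", "clone", repo_url, dest],
            capture_output=True, text=True,
        )
        if result.returncode == 0:
            return
        if not is_rate_limited(result.stderr) or attempt == max_attempts:
            raise RuntimeError(f"git clone failed: {result.stderr.strip()}")
        # Back off 2, 4, 8, ... seconds before the next attempt.
        time.sleep(2 ** attempt)
```

Exponential backoff matters here: retrying a 429 immediately tends to keep the client inside the server's rate-limit window.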
# Current situation: Currently, openpai-runtime is tightly coupled with PAI and Framework Controller. We have just split the code, but some logic is still mixed. To use the runtime, we need to use...
Currently, `runtimeUnknownError` is treated as a `runtimeAbortExit` error. We should distinguish these two errors.
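A hedged sketch of how the two cases might be separated; the class and function names below are hypothetical, not the runtime's actual API:

```python
class RuntimeExitError(Exception):
    """Base class for runtime exit errors (hypothetical hierarchy)."""


class RuntimeAbortExit(RuntimeExitError):
    """The runtime deliberately aborted the task for a known reason."""


class RuntimeUnknownError(RuntimeExitError):
    """The runtime exited for a cause that could not be classified."""


def classify_exit(exit_reason: str) -> RuntimeExitError:
    # Only map the explicitly known abort reason to RuntimeAbortExit;
    # anything else surfaces as unknown instead of being folded into abort.
    if exit_reason == "abort":
        return RuntimeAbortExit(exit_reason)
    return RuntimeUnknownError(exit_reason)
```

Keeping the unknown case distinct lets callers apply different retry or alerting policies to each error type.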
Add a keep-order option for the output plugin. This option helps users send logs to storage servers that cannot sort log lines by timestamp, such as Azure Blob or S3.
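One way to implement such an option is to buffer log lines and flush them in timestamp order before upload, since the storage backend cannot reorder them on read. A minimal sketch; the class name `OrderedLogBuffer` is hypothetical:

```python
import heapq


class OrderedLogBuffer:
    """Buffer log lines and release them sorted by timestamp, for sinks
    (e.g. Azure Blob, S3) that store objects as opaque, unsorted blobs."""

    def __init__(self):
        self._heap = []  # min-heap keyed on timestamp

    def add(self, timestamp: float, line: str) -> None:
        heapq.heappush(self._heap, (timestamp, line))

    def flush(self):
        """Drain the buffer, returning (timestamp, line) pairs in order."""
        ordered = []
        while self._heap:
            ordered.append(heapq.heappop(self._heap))
        return ordered
```

A real plugin would flush on a size or time threshold; the heap keeps each flush batch internally ordered even when lines arrive out of order.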