add the -L parameter to fix #5934
As the title says.
Should the target branch be main instead of r23.05?
Shall I switch the target branch to main now?
Honestly, I don't see a reason why we need to change the instructions by adding the -L/--location flag.
-L/--location (HTTP/HTTPS) If the server reports that the requested page has moved to a different location (indicated with a Location: header and a 3XX response code), this option will make curl redo the request on the new place. If used together with -i/--include or -I/--head, headers from all requested pages will be shown.

When authentication is used, curl only sends its credentials to the initial host. If a redirect takes curl to a different host, it won't be able to intercept the user+password. See also --location-trusted on how to change this. You can limit the amount of redirects to follow by using the --max-redirs option.

When curl follows a redirect and the request is not a plain GET (for example POST or PUT), it will do the following request with a GET if the HTTP response was 301, 302, or 303. If the response code was any other 3xx code, curl will re-send the following request using the same unmodified method.
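To illustrate the behavior described above, here is a minimal sketch; the URL is a placeholder, and the actual responses depend on the server returning a 3XX status:

```shell
# Without -L, curl prints the 3XX response (with its Location: header)
# and stops; it does not fetch the new location.
curl -i http://example.com/old-path

# With -L, curl re-issues the request against the URL from the Location:
# header. --max-redirs caps how many redirects are followed.
curl -L --max-redirs 5 -o output.bin http://example.com/old-path
```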
We have never observed a redirect on this request.
@is have you faced any issues at this request/instruction layer?
cc: @GuanLuo
Hi @mc-nv In China, all download requests to https://developer.download.nvidia.com/ are redirected to https://developer.download.nvidia.cn/, so the -L option is needed to follow the redirect.
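One way to check for this kind of redirect is to inspect the headers of the initial response; note this is a sketch, the path below is hypothetical, and the result depends on the network and CDN configuration from where the request is made:

```shell
# -s silences the progress meter; -I sends a HEAD request so only headers
# are returned. From a region where the redirect applies, the response
# should be a 3XX with a Location: header pointing at the .cn mirror.
curl -sI https://developer.download.nvidia.com/some/file

# Adding -L makes curl follow that Location: header automatically.
curl -sIL https://developer.download.nvidia.com/some/file
```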
@is Sorry for the late response, yes, please switch the target branch to main. You may need to rebase / resolve merge conflicts if GitHub reports any.
I rebased the PR onto the latest main branch.
Looks like this is also a duplicate of https://github.com/triton-inference-server/server/pull/6099
@is did you fill out a CLA as described here: https://github.com/triton-inference-server/server/blob/main/CONTRIBUTING.md#contributor-license-agreement-cla?
Hi @rmccorm4, from the relevant discussions I've seen, it seems you want to address the issue at a more fundamental level. Does this mean there's no urgency to sign the CLA for now?