Xiang Li
@ikaneshiro

```
root@dev0:/srv/data0/lx/res# export SINGULARITY_TMPDIR=/srv/data0/lx/tmp/ && singularity -d build --pem-path=rsa_pub.pem lolcow_enc.sif lolcow.def
DEBUG   [U=0,P=83858]  persistentPreRun()  Singularity version: 3.8.3+222-g2f16701e3
DEBUG   [U=0,P=83858]  persistentPreRun()  Parsing configuration file /usr/local/singularity/etc/singularity/singularity.conf
DEBUG   [U=0,P=83858]  handleConfDir()  /root/.singularity already...
```
> Isn't this what is done here already? https://github.com/triton-inference-server/client/blob/fe6ccbb00e57ce91e284bc5540b29e8929670f06/src/c%2B%2B/library/CMakeLists.txt#L55

Hi, I'm facing the same issue. Have you solved it?
> @heibaidaolx123 do you mind pointing me to the place where this is mentioned? I am not sure if this refers to there being only one inference at...
@jbkyang-nvi Thanks for your advice. I tried using multiple Java clients in multiple threads to send parallel infer requests, and I got GC errors. Then I turned to use...
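For what it's worth, the pattern of "multiple clients in multiple threads" can be sketched roughly like this. This is only a hedged illustration, not the real Triton Java API: `InferClient` below is a hypothetical stand-in for whatever client class you use, and the point is only the threading structure (one client instance per worker thread via `ThreadLocal`, fan-out through an `ExecutorService`), which avoids sharing a single client object across threads.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelInferSketch {
    // Hypothetical stand-in for an inference client; replace with the
    // real Triton Java client in practice.
    static class InferClient {
        String infer(String input) {
            return "result:" + input; // placeholder for the actual RPC call
        }
    }

    public static List<String> runParallel(List<String> inputs, int nThreads)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(nThreads);
        // One client per worker thread, so no client object is shared
        // between threads.
        ThreadLocal<InferClient> client = ThreadLocal.withInitial(InferClient::new);
        List<Future<String>> futures = new ArrayList<>();
        for (String in : inputs) {
            futures.add(pool.submit(() -> client.get().infer(in)));
        }
        // Collect results in submission order.
        List<String> results = new ArrayList<>();
        for (Future<String> f : futures) {
            results.add(f.get());
        }
        pool.shutdown();
        return results;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runParallel(List.of("a", "b", "c"), 2));
    }
}
```

Whether this helps with the GC errors depends on how large the request/response buffers are and how quickly they are released; reusing a bounded pool of clients (rather than creating one per request) is usually the first thing to try.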
also looking forward to it