
Is there a way to load LD_PRELOAD plugins dynamically?

Open sbmalik opened this issue 2 years ago • 12 comments

Description I want to set LD_PRELOAD after some computations in the Docker image, but I have been unable to find a solution for this. I also tried adding lines to nvidia_entrypoint.sh, but that did not work either.

Triton Information FROM nvcr.io/nvidia/tritonserver:21.03-py3

Are you using the Triton container or did you build it yourself? Using a Container

To Reproduce Steps to reproduce the behavior. In Dockerfile

  • FROM nvcr.io/nvidia/tritonserver:21.03-py3
  • ENV LD_PRELOAD=$(some bash command)

Expected behavior I want Triton to load the plugins I have downloaded into the /plugins directory, but I am currently unable to build that list and attach it to the environment dynamically. The files may have any names, so I cannot hard-code a constant value. It would be a great help if you could tell me where to put these paths in the code (e.g. in the tritonserver lifecycle) so I can resolve this.
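One way to do this without a constant value is a small entrypoint wrapper that builds the LD_PRELOAD string from whatever shared libraries exist at container start. This is a sketch, not Triton functionality: the function name, the /plugins directory, and the final tritonserver invocation shown in the comment are illustrative.

```shell
# Hypothetical helper: join every .so in a directory into a
# colon-separated string suitable for LD_PRELOAD.
build_preload_list() {
    dir="$1"
    list=""
    for so in "$dir"/*.so; do
        [ -e "$so" ] || continue          # glob may match nothing; skip then
        list="${list:+$list:}$so"         # append with ':' separator
    done
    printf '%s' "$list"
}

# In an entrypoint wrapper, one would then do something like:
#   export LD_PRELOAD="$(build_preload_list /plugins)"
#   exec tritonserver --model-repository=/models "$@"
```

Because the list is computed at container start rather than at build time, the plugin file names do not need to be known in advance.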

sbmalik avatar Feb 08 '22 12:02 sbmalik

@Tabrizian I've tried editing nvidia_entrypoint.sh, but it is not setting the environment variables either. Any suggestions?

sbmalik avatar Feb 09 '22 13:02 sbmalik

@sbmalik Did you try a helper script that returns a string X containing all the .so files in /plugins with their full absolute paths, and then launching the server with LD_PRELOAD=X tritonserver ...?

tanmayv25 avatar Feb 09 '22 18:02 tanmayv25

@tanmayv25 I am using the Docker approach, and loading the plugin the way you describe always gives me an error like /opt/tritonserver/nvidia_entrypoint.sh: 71: LD_PRELOAD=xyz.so: no such file or directory. Putting anything else before the tritonserver command does not work either.
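One common way to get an error of this shape (a guess, since the full script isn't shown) is an unquoted expansion: if the helper's output contains a space, sh parses `LD_PRELOAD=a.so b.so tritonserver` as an attempt to run a command named `b.so`. Quoting the expansion keeps the value a single word. In this sketch, `true` stands in for the tritonserver binary and the paths are made up.

```shell
# Value as produced by some helper; the plugin paths are placeholders.
preload="/plugins/a.so:/plugins/b.so"

# Prefix-assignment form, with the expansion quoted:
LD_PRELOAD="$preload" true

# Equivalent using env(1), which also avoids word-splitting surprises:
env LD_PRELOAD="$preload" true
```

Note that the dynamic loader only warns (on stderr) about entries it cannot find; it is the shell parse, not the loader, that produces "no such file or directory" here.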

sbmalik avatar Feb 09 '22 18:02 sbmalik

@sbmalik Is the shared library you are trying to load inside the Docker container? If not, you need to run the container in interactive mode, copy the library into it, and then run the LD_PRELOAD=X tritonserver ... command inside it instead of via nvidia_entrypoint.sh.

CoderHam avatar Feb 09 '22 19:02 CoderHam

@CoderHam That is a nice approach, but I can't do this because I am working on a complete product and running the container alongside other containers via docker-compose, so doing this manually would not be a solution, I think.

sbmalik avatar Feb 10 '22 00:02 sbmalik

@sbmalik Copy your plugins into the Docker image, and set the environment variable LD_PRELOAD=/path/to/plugins in the docker-compose.yaml.
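A minimal docker-compose sketch of this suggestion; the service name, image tag, and plugin path are placeholders. Note that the value is fixed when the compose file is written, so this only works if the plugin file names are known in advance.

```yaml
services:
  triton:
    image: nvcr.io/nvidia/tritonserver:21.03-py3
    environment:
      # A known, fixed plugin path (placeholder name):
      - LD_PRELOAD=/plugins/libmyplugin.so
    volumes:
      # Mount a host directory so plugins need not be baked into the image:
      - ./plugins:/plugins
```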

austingg avatar Feb 23 '22 02:02 austingg

@austingg Yes, this is a good approach, but I am downloading my plugins from AWS while building the Docker image. A RUN command cannot set a variable permanently, and ENV cannot run a command to set the plugin list afterwards.

sbmalik avatar Feb 24 '22 02:02 sbmalik

Is mounting a directory that contains the plugin while launching your container not an option?

CoderHam avatar Feb 24 '22 02:02 CoderHam

@CoderHam No, because as you know Triton Inference Server lets you use an AWS S3 bucket as the model repository, so I am loading and unloading all models from there. The plugins for these models are stored in the S3 bucket too, so while building the container I sync the plugins directory from S3.
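Since the plugin set lives in S3, one option is to move the sync from build time to container start, so the LD_PRELOAD list can be computed after the download. This is a sketch only: the bucket name, script name, and entrypoint contents are all placeholders, not anything Triton ships.

```dockerfile
FROM nvcr.io/nvidia/tritonserver:21.03-py3

# Hypothetical wrapper script that syncs plugins, builds LD_PRELOAD,
# and then launches the server.
COPY start-triton.sh /opt/start-triton.sh
ENTRYPOINT ["/opt/start-triton.sh"]

# start-triton.sh would do roughly:
#   aws s3 sync s3://<your-bucket>/plugins /plugins
#   export LD_PRELOAD="$(join /plugins/*.so with ':')"
#   exec tritonserver --model-repository=s3://<your-bucket>/models "$@"
```

The key point is that the environment variable is set by a process that then exec's tritonserver, so the loader sees it before the server starts.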

sbmalik avatar Feb 24 '22 02:02 sbmalik

Is mounting a directory that contains the plugin while launching your container not an option?

I deploy Triton on Kubernetes, so I want to upload and sync plugins from S3 too, but it seems impossible to achieve that.

bug-developer021 avatar Jun 30 '22 12:06 bug-developer021

Is mounting a directory that contains the plugin while launching your container not an option?

Does Triton support loading LD_PRELOAD plugins from S3 dynamically?

senlyu163 avatar Jul 21 '22 12:07 senlyu163

Does Triton support loading LD_PRELOAD plugins from S3 dynamically?

Unfortunately, no. The issue with preloading is that it must be performed before Triton is launched.

tanmayv25 avatar Sep 07 '22 18:09 tanmayv25

Closing due to lack of activity. Please re-open the issue if you would like to follow up.

jbkyang-nvi avatar Nov 22 '22 03:11 jbkyang-nvi

The TensorFlow backend now allows you to load custom plugins at runtime, in case that helps with the above. We've created similar functionality in some of the other backends as well such as TensorRT.

dyastremsky avatar Jul 06 '23 18:07 dyastremsky