🤔 model registry - inference with pytorch model
Describe your question
Hello! I have a question about the behavior of the Python library for interacting with determined.ai. Specifically, I want to download a specific PyTorch model from the Determined model registry and use it for real-time inference. From the documentation, it is not clear how to do this.
Checklist
- [X] Did you search the docs for a solution?
- [X] Did you search GitHub issues to see whether somebody asked this question before?
If you have the name of the model in the model registry and its version number, you can download the corresponding checkpoint like this:
```python
from determined.experimental import client

# Authenticate explicitly if your environment is not already configured;
# otherwise the default connection settings are used.
# client.login(master=..., user=..., password=...)

model_name = ...   # name of the model in the model registry
version_num = ...  # model version number

# checkpoint_dir is the path that contains the downloaded checkpoint.
# By default it will be checkpoints/<checkpoint_uuid> in the current working directory;
# pass path=<path> to download() to use a custom location.
checkpoint_dir = client.get_model(model_name).get_version(version_num).checkpoint.download()
```
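
Once the checkpoint is on disk, you still need to rebuild the model and load the weights before running inference. The exact layout of the checkpoint directory depends on your Determined version and your trial code; the sketch below assumes a PyTorchTrial checkpoint whose weights are stored in a `state_dict.pth` file under a `models_state_dict` entry, and `MyModel` is a placeholder for your own model class, so adjust both to match your setup.

```python
import os

import torch

from my_project.model import MyModel  # hypothetical: your own model definition

# Rebuild the network with the same architecture/hyperparameters used for training.
model = MyModel()

# PyTorchTrial checkpoints typically store their weights in state_dict.pth inside
# the downloaded directory; file names and keys can vary between Determined
# versions, so inspect the directory contents if this does not match.
ckpt = torch.load(os.path.join(checkpoint_dir, "state_dict.pth"), map_location="cpu")

# models_state_dict usually holds one state dict per model wrapped with
# context.wrap_model(); with a single model, take the first entry.
model.load_state_dict(ckpt["models_state_dict"][0])
model.eval()

# Real-time inference: run a forward pass without tracking gradients.
with torch.no_grad():
    example_input = torch.randn(1, 3, 224, 224)  # placeholder input shape
    prediction = model(example_input)
```

If you would rather not reconstruct the model yourself, check the API reference for your Determined version for checkpoint-loading helpers in `determined.pytorch`.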