Diffusers on SageMaker
Is your feature request related to a problem? Please describe. I'd like to run inference with Stable Diffusion through an API request!
Describe the solution you'd like Either an Inference API or a SageMaker integration!
Additional context Similar to https://huggingface.co/inference-endpoints or https://github.com/aws/sagemaker-huggingface-inference-toolkit
+1 I'd like to run pipeline.deploy(), which is the standard Hugging Face way of deploying.
cc @philschmid - could this make sense?
@AmanKishore you can already use diffusers on SageMaker with the Hugging Face inference toolkit and an inference.py + a requirements.txt. Here is an example of how to do this: https://github.com/huggingface/notebooks/blob/main/sagemaker/17_custom_inference_script/sagemaker-notebook.ipynb
How do I add "diffusers" there? I'm guessing that's what the requirements.txt is for? It's not in the example... Also, how do I download the data? I need to plug in my access token somehow, just git+lfs is not enough. Where do I send the pipeline to CUDA? This should happen in the model_fn()? Thank you!
How do I add "diffusers" there? I'm guessing that's what the requirements.txt is for? It's not in the example...
You add the requirements.txt here: https://github.com/huggingface/notebooks/tree/main/sagemaker/17_custom_inference_script/code
Also, how do I download the data?
Not sure what you mean.
I need to plug in my access token somehow, just git+lfs is not enough.
Add it to the inference.py, or load it as an SSM secret?
Where do I send the pipeline to CUDA? This should happen in the model_fn()?
Yes.
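Putting those answers together, a minimal code/inference.py could look like the sketch below. Assumptions: diffusers and torch come from code/requirements.txt, and the HF_API_TOKEN environment variable name is just an example (it only matters if you load the weights from the Hub instead of bundling them in model.tar.gz; you would set it yourself, e.g. via the env argument of HuggingFaceModel).

```python
# code/inference.py -- minimal sketch of the two toolkit hooks discussed above.
import base64
import io
import os

import torch
from diffusers import StableDiffusionPipeline


def model_fn(model_dir):
    # Loading from model_dir works if the weights are bundled in model.tar.gz;
    # otherwise swap in a Hub model id and pass your access token.
    pipe = StableDiffusionPipeline.from_pretrained(
        model_dir,
        torch_dtype=torch.float16,
        use_auth_token=os.environ.get("HF_API_TOKEN"),
    )
    # This is the place to move the pipeline to the GPU.
    return pipe.to("cuda")


def predict_fn(data, pipe):
    prompt = data.pop("inputs")
    image = pipe(prompt).images[0]
    # Return the generated image as base64 so it fits in a JSON response.
    buffer = io.BytesIO()
    image.save(buffer, format="PNG")
    return {"image": base64.b64encode(buffer.getvalue()).decode("utf-8")}
```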
I managed to get it to work - thanks for the tips! I still think it should be built into the general Hugging Face library and deployed with a simple pipe.deploy().
cc @jeffboudier for visibility
Came across the same problem, this is a big help. Thank you both :)
@MajorTal can you share your solution? I'm facing the same issue.
I've made this repo with everything needed to provision a SageMaker endpoint with API Gateway (for public access) for Stable Diffusion using the Hugging Face toolkit. I still need to prepare the README with all the docs, but the code is there.
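If you don't need API Gateway and just want to call the endpoint from Python, a plain boto3 call along these lines also works. The endpoint name and payload shape are placeholders that depend on your own inference.py.

```python
# Hypothetical example of invoking a deployed SageMaker endpoint with boto3.
import json

import boto3

runtime = boto3.client("sagemaker-runtime")

response = runtime.invoke_endpoint(
    EndpointName="stable-diffusion-endpoint",  # placeholder name
    ContentType="application/json",
    Body=json.dumps({"inputs": "a watercolor painting of a fox"}),
)

# The response body is whatever predict_fn returned (here, base64 image JSON).
result = json.loads(response["Body"].read())
```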
Awesome! Thanks, @DanielKneipp, that's very helpful.
How are you able to use the diffusers library? I added it to code/requirements.txt, and I'm getting this error:
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (400) from model with message "{
"code": 400,
"type": "InternalServerException",
"message": "No module named \u0027diffusers\u0027"
}
I'm running this in a SageMaker notebook instance.
The README is there now! 🎉
@baldesco I'd download the zip file with the model just to double-check that code/requirements.txt is there with the diffusers dependency set.
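A quick way to check this locally (the archive name is a placeholder) is to list the archive contents:

```python
# Sanity check: make sure the inference code and requirements are packaged
# alongside the model weights in the archive you upload to S3.
import tarfile

with tarfile.open("model.tar.gz", "r:gz") as tar:
    names = tar.getnames()

print("code/requirements.txt" in names)  # should be True
print("code/inference.py" in names)      # should be True
```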
Hey, I also created a nice end-to-end example of how to deploy Stable Diffusion to Amazon SageMaker: https://www.philschmid.de/sagemaker-stable-diffusion
It covers:
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.