Private Hugging Face models with the Inference Container for Amazon SageMaker
Is there a way to deploy private models with the inference container announced in this blog post: https://huggingface.co/blog/sagemaker-huggingface-llm?
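
For context, here is a minimal sketch of what I have in mind, following the deployment code from the blog post. My assumption is that the container can pick up a Hub access token from an environment variable such as `HUGGING_FACE_HUB_TOKEN` to pull a gated/private repo; the model ID, token placeholder, and instance type below are all hypothetical:

```python
import json
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

# Resolve the Hugging Face LLM inference container image, as in the blog post
llm_image = get_huggingface_llm_image_uri("huggingface", version="0.8.2")

role = sagemaker.get_execution_role()

# Assumption: the container reads a Hub token from an environment variable
# (e.g. HUGGING_FACE_HUB_TOKEN) so it can download a private model at startup.
config = {
    "HF_MODEL_ID": "my-org/my-private-model",   # hypothetical private repo
    "HUGGING_FACE_HUB_TOKEN": "<HF_TOKEN>",      # token with read access (placeholder)
    "SM_NUM_GPUS": json.dumps(1),
    "MAX_INPUT_LENGTH": json.dumps(1024),
    "MAX_TOTAL_TOKENS": json.dumps(2048),
}

llm_model = HuggingFaceModel(
    role=role,
    image_uri=llm_image,
    env=config,
)

llm = llm_model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",
    container_startup_health_check_timeout=300,
)
```

Is this the intended way to authenticate against the Hub from the container, or is there a different supported mechanism for private models?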