Huggingface Model Deployer
Describe changes
I implemented a ModelDeployer component that works with Huggingface repositories.
Pre-requisites
Please ensure you have done the following:
- [x] I have read the CONTRIBUTING.md document.
- [x] If my change requires a change to docs, I have updated the documentation accordingly.
- [x] I have added tests to cover my changes.
- [x] I have based my new branch on develop and the open PR is targeting develop. If your branch wasn't based on develop, read the contribution guide on rebasing your branch to develop.
- [ ] If my changes require changes to the dashboard, these changes are communicated/requested.
Types of changes
- [ ] Bug fix (non-breaking change which fixes an issue)
- [x] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)
- [ ] Other (add details above)
Summary by CodeRabbit
- New Features
- Introduced Huggingface integration for deploying machine learning models using Huggingface's infrastructure.
- Added support for configuring and managing Huggingface inference endpoints.
- New deployment functionality for Huggingface models, including creation, update, and stopping of model deployment services.
- Implemented a pipeline step for continuous deployment with Huggingface Inference Endpoint.
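The summary above mentions creation, update, and stopping of model deployment services. As a rough illustration of that lifecycle (all names here are hypothetical, not ZenML's actual API; the real service class wraps the huggingface_hub Inference Endpoints API):

```python
from dataclasses import dataclass
from enum import Enum


class ServiceState(Enum):
    """Possible states for a toy deployment service."""
    INACTIVE = "inactive"
    ACTIVE = "active"


@dataclass
class ToyEndpointService:
    """Illustrative stand-in for a Huggingface endpoint service."""
    model_repository: str
    state: ServiceState = ServiceState.INACTIVE

    def start(self) -> None:
        # A real implementation would provision an inference endpoint here.
        self.state = ServiceState.ACTIVE

    def update(self, model_repository: str) -> None:
        # Updating re-points the service at a new model repo/revision.
        self.model_repository = model_repository

    def stop(self) -> None:
        # A real implementation would pause or delete the endpoint.
        self.state = ServiceState.INACTIVE


service = ToyEndpointService(model_repository="org/my-model")
service.start()
print(service.state.value)   # active
service.update("org/my-model-v2")
service.stop()
print(service.state.value)   # inactive
```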
[!IMPORTANT]
Auto Review Skipped
Auto reviews are disabled on this repository.
Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository. To trigger a single review, invoke the @coderabbitai review command.
Walkthrough
The update introduces Huggingface integration into ZenML, allowing for deploying machine learning models using Huggingface's infrastructure. It adds necessary configurations, implements a model deployer with methods for managing deployments, and provides a deployment service for handling inference endpoints. This integration facilitates continuous deployment pipelines within ZenML, leveraging Huggingface's capabilities for model serving.
Changes
| File(s) | Change Summary |
|---|---|
| src/zenml/integrations/huggingface/__init__.py | Introduced imports and constants for the Huggingface integration, including the model deployer flavor and service artifact constants. Updated requirements to include "huggingface_hub". |
| src/zenml/integrations/huggingface/flavors/... | Added classes for Huggingface integration flavors, including the model deployer configuration and base config. |
| src/zenml/integrations/huggingface/model_deployers/... | Added the Huggingface Model Deployer with methods for deployment management. |
| src/zenml/integrations/huggingface/services/... | Added the Huggingface Service, implementing deployment and management functionality for Huggingface models. |
| src/zenml/integrations/huggingface/steps/... | Implemented a deployment step for continuous deployment with Huggingface, including service configuration and deployment logic. |
Thank you for using CodeRabbit. We offer it for free to the OSS community and would appreciate your support in helping us grow. If you find it useful, would you consider giving us a shout-out on your favorite social media?
Tips
Chat
There are 3 ways to chat with CodeRabbit:
- Review comments: Directly reply to a review comment made by CodeRabbit. Examples:
  - I pushed a fix in commit <commit_id>.
  - Generate unit-tests for this file.
  - Open a follow-up GitHub issue for this discussion.
- Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
  - @coderabbitai generate unit tests for this file.
  - @coderabbitai modularize this function.
- PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
  - @coderabbitai generate interesting stats about this repository and render them as a table.
  - @coderabbitai show all the console.log statements in this repository.
  - @coderabbitai read src/utils.ts and generate unit tests.
  - @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.
Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.
CodeRabbit Commands (invoked as PR comments)
- @coderabbitai pause to pause the reviews on a PR.
- @coderabbitai resume to resume the paused reviews.
- @coderabbitai review to trigger a review. This is useful when automatic reviews are disabled for the repository.
- @coderabbitai resolve to resolve all the CodeRabbit review comments.
- @coderabbitai help to get help.
Additionally, you can add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.
CodeRabbit Configuration File (.coderabbit.yaml)
- You can programmatically configure CodeRabbit by adding a .coderabbit.yaml file to the root of your repository.
- The JSON schema for the configuration file is available here.
- If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation:
# yaml-language-server: $schema=https://coderabbit.ai/integrations/coderabbit-overrides.v2.json
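Since auto reviews are disabled on this repository, a minimal .coderabbit.yaml toggling that setting might look like the sketch below. The field names are an assumption based on the CodeRabbit schema; verify them against the linked JSON schema before committing.

```yaml
# yaml-language-server: $schema=https://coderabbit.ai/integrations/coderabbit-overrides.v2.json
reviews:
  auto_review:
    enabled: true  # re-enable automatic reviews on new PRs
```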
CodeRabbit Discord Community
Join our Discord Community to get help, request features, and share feedback.
@coderabbit review
@coderabbitai review
@dudeperf3ct can you check the CI errors and update as appropriate? there's some linting issues and also some docstrings to be added as far as I can see.
@strickvl I have addressed the CI errors.
Re: Docs and Testing. I have added docs for the new deployer. I am not sure about using the https://github.com/huggingface/hf-endpoints-emulator repo/package for testing.
Should I implement an example under tests/integration/examples for this new deployer? As part of running this step, an endpoint should be provisioned. Should I mock creating endpoint and use the emulated endpoint instead?
@dudeperf3ct yeah that was sort of my idea. Either way works IMO, but some kind of an end-to-end example for this would be great that we can test in the integration tests runners.
@strickvl Re: CI error and dependencies. The huggingface_hub dependency is added as part of the huggingface integration.
According to docs, the minimal version of huggingface_hub package supporting Inference Endpoints API is v0.19.0.
Not sure what can be done to avoid the dependency conflict on fsspec package raised while setting up the environment.
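The v0.19.0 floor mentioned above can be expressed as a simple version check. A stdlib-only sketch (the helper name is made up for illustration; a real requirement pin like `huggingface_hub>=0.19.0` in the integration's requirements is the proper fix):

```python
def supports_inference_endpoints(version: str) -> bool:
    """Return True if a huggingface_hub version string is >= 0.19.0.

    Illustrative helper, not part of ZenML. Per the Huggingface docs,
    v0.19.0 is the first release with the Inference Endpoints API.
    """
    parts = []
    for piece in version.split(".")[:3]:
        # Keep only the leading digits so suffixes like "0.19.0rc1" parse
        # (pre-release ordering is ignored in this rough sketch).
        digits = ""
        for ch in piece:
            if not ch.isdigit():
                break
            digits += ch
        parts.append(int(digits or 0))
    while len(parts) < 3:
        parts.append(0)
    return tuple(parts) >= (0, 19, 0)


print(supports_inference_endpoints("0.19.0"))  # True
print(supports_inference_endpoints("0.18.2"))  # False
```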
@dudeperf3ct that seems to be coming from the Azure dependency and adlfs, which we have pinned down quite hard to one specific version (2021.10.0) via the Azure integration. It seems due for a bump anyway. I'll work on freeing you up that way. Will make a PR, but for now maybe just try https://github.com/zenml-io/zenml/blob/d216c662fb65e5abfaad6729b0f2d826f0611408/src/zenml/integrations/azure/__init__.py#L42-L43 as >= instead of ==?
Looks like S3FS will need to be bumped as well...
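The suggested change amounts to relaxing exact pins in the Azure integration's requirements to lower bounds. A hypothetical before/after sketch (the s3fs version shown is illustrative, and this is not the integration's full requirement list):

```python
# Hypothetical excerpt of the Azure integration's REQUIREMENTS list.
# Exact pins (==) block dependency resolution when other packages,
# e.g. huggingface_hub via fsspec, need newer releases.
PINNED = ["adlfs==2021.10.0", "s3fs==2022.3.0"]    # before: exact pins
RELAXED = [r.replace("==", ">=") for r in PINNED]  # after: lower bounds only

print(RELAXED)  # ['adlfs>=2021.10.0', 's3fs>=2022.3.0']
```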
@dudeperf3ct #2402 looks on the surface of things to be ok, so maybe just add the change for S3FS into this PR and let's see if we can get it to work here too? (Hopefully will get that PR merged soon too, but in the interests of time...)