
Huggingface Model Deployer

Open dudeperf3ct opened this issue 1 year ago • 10 comments

Describe changes

I implemented a ModelDeployer component to work with Huggingface repos.

Pre-requisites

Please ensure you have done the following:

  • [x] I have read the CONTRIBUTING.md document.
  • [x] If my change requires a change to docs, I have updated the documentation accordingly.
  • [x] I have added tests to cover my changes.
  • [x] I have based my new branch on develop and the open PR is targeting develop. If your branch wasn't based on develop, read the Contribution guide on rebasing your branch to develop.
  • [ ] If my changes require changes to the dashboard, these changes are communicated/requested.

Types of changes

  • [ ] Bug fix (non-breaking change which fixes an issue)
  • [x] New feature (non-breaking change which adds functionality)
  • [ ] Breaking change (fix or feature that would cause existing functionality to change)
  • [ ] Other (add details above)

Summary by CodeRabbit

  • New Features
    • Introduced Huggingface integration for deploying machine learning models using Huggingface's infrastructure.
    • Added support for configuring and managing Huggingface inference endpoints.
    • New deployment functionality for Huggingface models, including creation, update, and stopping of model deployment services.
    • Implemented a pipeline step for continuous deployment with Huggingface Inference Endpoint.

dudeperf3ct avatar Jan 30 '24 16:01 dudeperf3ct

[!IMPORTANT]

Auto Review Skipped

Auto reviews are disabled on this repository.

Please check the settings in the CodeRabbit UI or the .coderabbit.yaml file in this repository.

To trigger a single review, invoke the @coderabbitai review command.

Walkthrough

The update introduces Huggingface integration into ZenML, allowing for deploying machine learning models using Huggingface's infrastructure. It adds necessary configurations, implements a model deployer with methods for managing deployments, and provides a deployment service for handling inference endpoints. This integration facilitates continuous deployment pipelines within ZenML, leveraging Huggingface's capabilities for model serving.
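As a rough illustration of the intended flow, a continuous-deployment pipeline could wire the new step in as sketched below. The step name huggingface_model_deployer_step, its import path, and its parameters are assumptions about this PR's API rather than verified code.

```python
# Sketch only: step and parameter names below are assumptions about the new
# integration's public API, not confirmed by this PR.
from zenml import pipeline, step

# Hypothetical import path for the step added under
# src/zenml/integrations/huggingface/steps/ in this PR.
from zenml.integrations.huggingface.steps import huggingface_model_deployer_step


@step
def train_and_push_model() -> str:
    """Placeholder step: train a model and push it to the Hugging Face Hub."""
    return "my-org/my-finetuned-model"  # hypothetical Hub repository ID


@pipeline
def continuous_deployment_pipeline():
    repo_id = train_and_push_model()
    # Provision (or update) an Inference Endpoint serving the pushed model.
    huggingface_model_deployer_step(model_name="my-model", repository=repo_id)


if __name__ == "__main__":
    continuous_deployment_pipeline()
```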

Changes

File(s) Change Summary
src/zenml/integrations/huggingface/__init__.py Introduced imports and constants for Huggingface integration, including model deployer flavor and service artifact constants. Updated requirements to include "huggingface_hub".
src/zenml/integrations/huggingface/flavors/... Added classes for Huggingface integration flavors, including model deployer configuration and base config.
src/zenml/integrations/huggingface/model_deployers/... Launched Huggingface Model Deployer with methods for deployment management.
src/zenml/integrations/huggingface/services/... Initiated Huggingface Service, implementing deployment and management functionality for Huggingface models.
src/zenml/integrations/huggingface/steps/... Implemented a deployment step for continuous deployment with Huggingface, including service configuration and deployment logic.
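To make the first row above concrete: ZenML integrations usually declare their requirements and flavors through an Integration subclass, so the __init__.py change presumably looks roughly like the sketch below. The class name, constants, flavor name, and the exact requirement list are assumptions.

```python
# Sketch of the integration declaration implied by the __init__.py changes.
# Constant values, the flavor class name, and the requirement list are
# assumptions; only the addition of "huggingface_hub" is stated in this PR.
from typing import List, Type

from zenml.integrations.constants import HUGGINGFACE
from zenml.integrations.integration import Integration
from zenml.stack import Flavor

HUGGINGFACE_MODEL_DEPLOYER_FLAVOR = "huggingface"  # assumed flavor name
HUGGINGFACE_SERVICE_ARTIFACT = "hf_deployment_service"  # assumed constant


class HuggingfaceIntegration(Integration):
    """Declares the Huggingface integration and its Python requirements."""

    NAME = HUGGINGFACE
    # "huggingface_hub" is the new requirement; >=0.19.0 is the minimum
    # version with the Inference Endpoints API (see the discussion below).
    REQUIREMENTS = ["transformers", "huggingface_hub>=0.19.0"]

    @classmethod
    def flavors(cls) -> List[Type[Flavor]]:
        """Expose the new model deployer flavor to the stack registry."""
        from zenml.integrations.huggingface.flavors import (
            HuggingFaceModelDeployerFlavor,
        )

        return [HuggingFaceModelDeployerFlavor]
```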

Thank you for using CodeRabbit. We offer it for free to the OSS community and would appreciate your support in helping us grow. If you find it useful, would you consider giving us a shout-out on your favorite social media?

Tips

Chat

There are 3 ways to chat with CodeRabbit:

  • Review comments: Directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>.
    • Generate unit-tests for this file.
    • Open a follow-up GitHub issue for this discussion.
  • Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
    • @coderabbitai generate unit tests for this file.
    • @coderabbitai modularize this function.
  • PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
    • @coderabbitai generate interesting stats about this repository and render them as a table.
    • @coderabbitai show all the console.log statements in this repository.
    • @coderabbitai read src/utils.ts and generate unit tests.
    • @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.

Note: Be mindful of the bot's finite context window. It's strongly recommended to break down tasks such as reading entire modules into smaller chunks. For a focused discussion, use review comments to chat about specific files and their changes, instead of using the PR comments.

CodeRabbit Commands (invoked as PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger a review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai resolve to resolve all the CodeRabbit review comments.
  • @coderabbitai help to get help.

Additionally, you can add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.

CodeRabbit Configuration File (.coderabbit.yaml)

  • You can programmatically configure CodeRabbit by adding a .coderabbit.yaml file to the root of your repository.
  • The JSON schema for the configuration file is available here.
  • If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation: # yaml-language-server: $schema=https://coderabbit.ai/integrations/coderabbit-overrides.v2.json

CodeRabbit Discord Community

Join our Discord Community to get help, request features, and share feedback.

coderabbitai[bot] avatar Jan 30 '24 16:01 coderabbitai[bot]

@coderabbit review

htahir1 avatar Jan 30 '24 16:01 htahir1

@coderabbitai review

htahir1 avatar Jan 30 '24 22:01 htahir1

@dudeperf3ct can you check the CI errors and update as appropriate? There are some linting issues and also some docstrings to be added, as far as I can see.

strickvl avatar Feb 05 '24 13:02 strickvl

@strickvl I have addressed the CI errors.

Re: Docs and Testing. I have added docs for the new deployer. I am not sure about using the https://github.com/huggingface/hf-endpoints-emulator repo/package for testing.

Should I implement an example under tests/integration/examples for this new deployer? As part of running this step, an endpoint would be provisioned. Should I mock creating the endpoint and use the emulated endpoint instead?
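One option would be to patch the endpoint-provisioning call in the test and let the emulator (or a plain mock) stand in for the real endpoint, roughly like the sketch below. The patch target and the attributes on the fake endpoint are guesses, since the service module isn't finalized.

```python
# Sketch only: mock out endpoint provisioning so no real infrastructure is
# created. The patch target may need to move to wherever the deployer
# imports create_inference_endpoint from once the service module lands.
from unittest import mock


def test_deployer_does_not_provision_real_endpoint():
    fake_endpoint = mock.MagicMock()
    fake_endpoint.status = "running"
    fake_endpoint.url = "http://localhost:5000"  # e.g. hf-endpoints-emulator

    with mock.patch(
        "huggingface_hub.create_inference_endpoint",
        return_value=fake_endpoint,
    ) as create_endpoint:
        # run the example pipeline / deployer step here
        ...

    # afterwards we could assert that exactly one endpoint was requested:
    # create_endpoint.assert_called_once()
```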

dudeperf3ct avatar Feb 05 '24 14:02 dudeperf3ct

@dudeperf3ct yeah, that was sort of my idea. Either way works IMO, but some kind of end-to-end example for this that we can test in the integration test runners would be great.

strickvl avatar Feb 06 '24 08:02 strickvl

@strickvl Re: CI errors and dependencies. The huggingface_hub dependency is added as part of the huggingface integration.

According to the docs, the minimum version of the huggingface_hub package supporting the Inference Endpoints API is v0.19.0.
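For context, v0.19.0 is the release that added the endpoint management helpers the new service is expected to wrap; a minimal usage sketch (the repository, instance, and region values are only illustrative):

```python
# Minimal sketch of the huggingface_hub>=0.19.0 Inference Endpoints API.
# All concrete values below are illustrative, not tied to this PR.
from huggingface_hub import create_inference_endpoint, get_inference_endpoint

endpoint = create_inference_endpoint(
    "zenml-demo-endpoint",
    repository="distilbert-base-uncased-finetuned-sst-2-english",
    framework="pytorch",
    task="text-classification",
    accelerator="cpu",
    vendor="aws",
    region="us-east-1",
    type="protected",
    instance_size="medium",
    instance_type="c6i",
)
endpoint.wait()  # block until the endpoint is running

# Later, the deployer can look the endpoint up again to update or stop it.
same_endpoint = get_inference_endpoint("zenml-demo-endpoint")
same_endpoint.pause()   # stop serving without deleting the endpoint
same_endpoint.delete()  # tear the endpoint down entirely
```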

Not sure what can be done to avoid the dependency conflict on the fsspec package raised while setting up the environment.

(Screenshot: dependency resolver error showing the fsspec version conflict.)

dudeperf3ct avatar Feb 06 '24 09:02 dudeperf3ct

@dudeperf3ct that seems to be coming from the Azure dependency and adlfs, which we have pinned down quite hard to one specific version (2021.10.0) via the Azure integration. It seems due for a bump anyway. I'll work on freeing you up that way and make a PR, but for now maybe just try changing https://github.com/zenml-io/zenml/blob/d216c662fb65e5abfaad6729b0f2d826f0611408/src/zenml/integrations/azure/init.py#L42-L43 to >= instead of ==?
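Concretely, the idea is just to relax the pin on that line, something like the sketch below (the neighbouring entries are placeholders, not the actual requirement list):

```python
# Suggested change in src/zenml/integrations/azure/__init__.py (sketch only;
# surrounding requirements are placeholders).
REQUIREMENTS = [
    # ...other Azure integration requirements...
    "adlfs>=2021.10.0",  # was: "adlfs==2021.10.0"
    # ...
]
```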

strickvl avatar Feb 06 '24 09:02 strickvl

Looks like S3FS will need to be bumped as well...

strickvl avatar Feb 06 '24 10:02 strickvl

@dudeperf3ct #2402 looks on the surface of things to be OK, so maybe just add the S3FS change to this PR and let's see if we can get it to work here too? (Hopefully that PR will get merged soon as well, but in the interests of time...)

strickvl avatar Feb 06 '24 16:02 strickvl