BentoML
Apache Kafka integration
Is your feature request related to a problem? Please describe.
Streaming serving is a common pattern for deploying models, where the ML model is applied on streaming data sets.
Describe the solution you'd like
- Docs on an end-to-end model serving solution built with BentoML and Kafka
- Build related API & Integration in BentoML if needed
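One common shape for this integration is a consumer loop that reads records from a Kafka topic and forwards each one to a running BentoML HTTP endpoint. A minimal sketch, assuming `kafka-python` and `requests` are installed and that the topic name (`model-input`) and endpoint URL (`http://localhost:3000/predict`) are hypothetical placeholders:

```python
import json


def build_prediction_request(record_value: bytes) -> dict:
    """Decode a Kafka record value (JSON-encoded bytes) into the request
    body for a BentoML prediction endpoint. The {"instances": [...]}
    envelope here is an illustrative choice, not a BentoML requirement."""
    payload = json.loads(record_value)
    return {"instances": [payload]}


def consume_and_serve(bootstrap_servers: str = "localhost:9092",
                      topic: str = "model-input",
                      endpoint: str = "http://localhost:3000/predict") -> None:
    # Requires: pip install kafka-python requests
    # Topic, broker address, and endpoint URL are assumptions for the sketch.
    from kafka import KafkaConsumer
    import requests

    consumer = KafkaConsumer(topic, bootstrap_servers=bootstrap_servers)
    for record in consumer:
        # Forward each streaming record to the model server and log the result.
        resp = requests.post(endpoint, json=build_prediction_request(record.value))
        print(resp.json())
```

A real integration would also need batching, error handling, and a way to publish predictions back to an output topic, which is where dedicated API support in BentoML could help.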
Describe alternatives you've considered
Welcome suggestions!
Additional context
- Streaming serving with Apache Spark is covered in https://github.com/bentoml/BentoML/issues/890
- Thanks, @kaiwaehner for suggesting this. Here's a related post on this topic by Kai https://www.kai-waehner.de/blog/2020/10/27/streaming-machine-learning-kafka-native-model-server-deployment-rpc-embedded-streams/
@pandafy and I would like to work on this issue. We're both MLH Fellows.
@awalvie @pandafy let me know if you have any questions or would like to discuss your approach & design before starting to implement it. I'm always available in the BentoML community slack and MLH discord.
What is the status on this, @awalvie?
Hey, I've moved away from this issue for the moment; please unassign me so that others can take it up.
Hi @Styren - we are looking to speak with users who are interested in this feature, let me know if you'd be open to sharing a bit more about your use case and requirements.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue is listed in the "Done" column of the roadmap, which may confuse people into thinking it has actually been implemented.