Health of the Vineyard project
xref: https://github.com/cncf/toc/issues/1669
Hi - I'm from the CNCF TOC and checking in on the project. In accordance with the TOC's quarterly health review process, the Vineyard project's stats report a decline in activity since August 2024.
The TOC would like to understand whether this project is active, having difficulty and/or in need of assistance, or ready for archival.
For more information on archival:
https://www.cncf.io/archived-projects/
https://github.com/cncf/toc/blob/main/process/archiving.md
Please provide a response either on this issue or (preferably) the TOC's issue tracking this project's health (linked above). We want to ensure this project can return to a state of health and activity if the project team desires it, or celebrate the innovation and development thus far as part of archival.
(cc the most recent maintainers for visibility: @sighingnow @dashanji @siyuan0322 @vegetableysm)
/cc @sighingnow, this issue/PR has had no activity for a long time; please help review the status and assign people to work on it.
Hi @chira001.
We would like to acknowledge the recent inactivity in the Vineyard (v6d) project. The main reason is that we have been focusing on internal development, where a series of significant enhancements and optimizations have been carried out. These improvements aim to strengthen v6d’s capabilities as a foundational infrastructure component for large-scale data processing and AI workloads — especially in the field of Large Language Models (LLMs).
While this has temporarily slowed down our open-source updates, we are now actively working on gradually open-sourcing key components and features. We firmly believe that Vineyard (v6d) holds great potential in the open-source ecosystem, especially as a core building block for next-generation AI infrastructure, in projects such as AIBrix (https://github.com/vllm-project/aibrix) and beyond.
Our ongoing efforts that will be shared with the community include:
- Enhanced remote data access that leverages RDMA for high-performance data transfer.
- A specialized metadata service design optimized for KVCache sharing in LLM workloads.
- End-to-end integration with vLLM and SGLang that provides unified KVCache management for both distributed KVCache and prefill/decode (P/D) disaggregated inference pipelines.
Future Plans:
Focus on AI Infrastructure: We will continue optimizing the existing codebase to better support AI and machine learning workflows, with a special emphasis on LLM-related integrations. One of the key use cases we're exploring is using Vineyard as a high-performance storage backend for a KVCache connector, enabling efficient management and sharing of attention key-value caches across distributed inference pipelines.
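To make the KVCache use case a bit more concrete, below is a minimal sketch of the sharing pattern using Vineyard's existing Python client (`vineyard.connect`, `client.put`, `client.persist`, `client.get`). The connector wiring into vLLM/SGLang is omitted since those integrations are still being open-sourced; the socket path, tensor shape, and the idea of keying blocks by a token-prefix hash are illustrative assumptions only.

```python
import numpy as np
import vineyard

# Connect to the local vineyardd instance through its IPC (UNIX-domain) socket.
# The socket path is an assumption; adjust it to your deployment.
client = vineyard.connect("/var/run/vineyard.sock")

# Producer side (e.g. a prefill worker): store the KV tensors computed for a
# prompt prefix as a shared object in vineyard. The
# [k/v, num_layers, num_tokens, head_dim] shape is purely illustrative.
kv_block = np.zeros((2, 32, 128, 64), dtype=np.float16)
object_id = client.put(kv_block)
client.persist(object_id)  # make the object visible beyond this vineyardd instance

# Consumer side (e.g. a decode worker in another process): fetch the shared
# block by object id. In a real connector the id would be looked up from a
# key derived from the token-prefix hash rather than passed around directly.
shared_kv = client.get(object_id)
assert shared_kv.shape == kv_block.shape
```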
Strengthen Community Collaboration: We aim to actively engage with the community by collecting user feedback and encouraging more developers to contribute and collaborate. AIBrix is a great example of how we can work together to build more robust and versatile AI infrastructure.
Thank you for your understanding and continued support. We are committed to making Vineyard (v6d) more active, valuable, and impactful moving forward, and we look forward to sharing our progress with you in the near future.