Rust version
Let's build the Rust version of llmware-ai.
Interesting idea ... Where would you want to start with Rust? BTW, a big piece of the back-end of llmware is written in native C (check the /libs folder). The rest is Python. Do you see a couple of targeted places where we could integrate Rust for performance (especially in areas like scalable inference)?
@doberst Gradual integration: instead of rewriting the entire system in Rust, we could consider a gradual integration. Let's start with the most performance-critical parts, and then expand Rust's usage as we become more comfortable with its integration into the system. Also, since llmware already uses native C and Python, it's important to ensure that the Rust components can interoperate seamlessly with the existing code. Rust has good support for interoperability with C, so we can create Rust libraries that are callable from C code. What do you think?
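To make the interop point concrete, here is a minimal sketch of what "a Rust library callable from C" looks like. The function name and the token-counting task are purely illustrative (not part of llmware); the point is the `#[no_mangle]` + `extern "C"` boundary, which produces a stable C ABI symbol that the existing C back-end (or Python via `ctypes`) could link against when the crate is built as a `cdylib`:

```rust
use std::ffi::{CStr, CString};
use std::os::raw::c_char;

/// Counts whitespace-separated tokens in a NUL-terminated C string.
/// `#[no_mangle]` keeps the symbol name stable, and `extern "C"` gives
/// it the C calling convention, so C code can call it directly.
#[no_mangle]
pub extern "C" fn count_tokens_ws(text: *const c_char) -> usize {
    if text.is_null() {
        return 0;
    }
    // SAFETY: the caller must pass a valid NUL-terminated string.
    let s = unsafe { CStr::from_ptr(text) };
    s.to_string_lossy().split_whitespace().count()
}

fn main() {
    // Exercise the function from Rust itself, simulating a C caller.
    let sample = CString::new("hello llmware rust bridge").unwrap();
    println!("{}", count_tokens_ws(sample.as_ptr())); // prints 4
}
```

In a real integration the `main` would be dropped, the crate built with `crate-type = ["cdylib"]`, and the C side would declare `size_t count_tokens_ws(const char *text);` in a header.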
Just brainstorming: it might be interesting to explore production-grade inference, potentially in conjunction with GGUF models (which are already compiled in C/C++ and do not necessarily require Python/PyTorch) .... There are a few production-grade Rust inference server implementations - that might be a really interesting "bridge" to create?
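One lightweight form of that bridge is a Rust supervisor that launches a GGUF inference binary as a background process. The sketch below only builds the command; the binary name `./llama-cli`, the model path, and the `-m`/`-p`/`-n` flags are assumptions modeled on typical llama.cpp-style CLIs, not a confirmed llmware interface:

```rust
use std::process::Command;

/// Builds (but does not run) a command invoking a hypothetical GGUF
/// inference binary with a model path, prompt, and token budget.
fn build_infer_cmd(binary: &str, model_path: &str, prompt: &str, n_predict: u32) -> Command {
    let mut cmd = Command::new(binary);
    cmd.arg("-m")
        .arg(model_path)
        .arg("-p")
        .arg(prompt)
        .arg("-n")
        .arg(n_predict.to_string());
    cmd
}

fn main() {
    // In a real bridge you would call `.output()` or `.spawn()` on this,
    // capture stdout, and stream the generated text back to the caller.
    let cmd = build_infer_cmd("./llama-cli", "models/model.gguf", "Hello", 32);
    println!("{:?}", cmd);
}
```

A production version would add streaming, timeouts, and process pooling, but even this shape shows why Rust fits the "background process" role: no Python runtime is needed between the server and the compiled C/C++ inference engine.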
Closing the issue - idea around Rust did not progress. The LLMWare code base is substantially in Python and C/C++ back-ends, and will likely remain so. Open to explore opportunities for selected components in Go and Rust - likely as bridges into either API/multi-threaded/web front-ends (Go) or background processes (Rust) - if there is a compelling reason to do this - and will also consider opportunities to optimize some high-volume 'loops' by pushing back into C/C++ for performance.