virtual-assistant-llm
Build a virtual assistant using real-time LLMs with Pathway
Table of contents
- The problem
- Solution
- Run the whole thing in 5 minutes
- Wanna learn more real-time ML?
The problem
Building a Virtual Assistant using LLMs takes a bit more work than just sending API calls to OpenAI.
As soon as you start implementing such a system, you quickly realize you need to spend a lot of time building a backbone of infrastructure and data-engineering boilerplate before you can even work on your business logic.
That means you spend too much time on the non-differentiating parts of your product, and not enough on the parts that actually set it apart.
So the question is:
Is there a faster way to build real-world apps using LLMs?
… and the answer is YES!
Solution
Pathway is an open-source framework for high-throughput and low-latency real-time data processing.
Pathway provides the backbone of services and real-time data processing, on top of which you define your business logic in Python. So you focus on the business logic, and let Pathway handle the low-level details and data-engineering boilerplate.
So you build real-world LLMOps apps, faster.
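To give a feel for the programming model, here is a minimal sketch of a Pathway pipeline. It is not this repository's code: the folder path, schema, and column names are placeholder assumptions. It streams rows from a folder of CSV files, keeps a running aggregate, and writes the results as they update.

```python
import pathway as pw


# Placeholder schema for incoming events (an assumption, not this repo's schema).
class EventSchema(pw.Schema):
    user_id: str
    amount: float


# Watch a folder of CSV files and ingest new rows as they arrive.
events = pw.io.csv.read("./events/", schema=EventSchema, mode="streaming")

# Business logic in plain Python: a continuously updated total per user.
totals = events.groupby(pw.this.user_id).reduce(
    pw.this.user_id,
    total=pw.reducers.sum(pw.this.amount),
)

# Stream the updated results to a JSON Lines file.
pw.io.jsonlines.write(totals, "./totals.jsonl")

# Start the real-time computation; Pathway keeps the outputs up to date.
pw.run()
```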
Let's see how to implement our virtual assistant using Pathway.
Run the whole thing in 5 minutes
- Install all project dependencies inside an isolated virtual environment, using Python Poetry:

  ```bash
  $ make init
  ```

- Create an `.env` file and fill in the necessary credentials. You will need an OpenAI API key, and a Discord webhook to receive notifications (an illustrative example is sketched after this list):

  ```bash
  $ cp .env.example .env
  ```

- Run the virtual assistant:

  ```bash
  $ make run
  ```

- Send the first request:

  ```bash
  $ make request
  ```

- Simulate the push of the first event to the data warehouse:

  ```bash
  $ make push_first_event
  ```

- Simulate the push of the second event to the data warehouse:

  ```bash
  $ make push_second_event
  ```
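The actual variable names are defined by this repository's `.env.example`; the snippet below is only a hypothetical illustration of what the filled-in file looks like, so keep the keys from `.env.example` rather than these.

```bash
# Hypothetical key names, for illustration only -- use the ones from .env.example.
OPENAI_API_KEY=sk-...
DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/...
```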
Video lecture
Coming soon!
Wanna learn more real-time ML?
Join more than 10k subscribers to the Real-World ML Newsletter. Every Saturday morning.
