Integrate with langchain-serve to expose `skyagi-as-a-service`?
Would you be open to integrating skyagi with langchain-serve?
- You won't have to handle API routes yourself (we use FastAPI & Jina to handle/scale the routes).
- Can enable streaming using WebSocket & human-in-the-loop integration - this is a great addition for chat scenarios.
- Local to cloud-ready in just one command: `lc-serve deploy local/jcloud`.
- Serverless/autoscaling endpoints with automatic TLS certs on Cloud.
- Already includes babyagi-as-a-service!
Disclaimer: I'm the primary author of langchain-serve. Would be happy to collaborate on this!
Sure, sounds like an amazing idea! Let's do it! @deepankarm
Hi @deepankarm, thanks for reaching out to us. I had a brief look at langchain-serve. One question I couldn't find the answer to: how does it support serving/deployment options other than Jina Cloud?
We can have a quick chat if that helps answer the question. Regards.
@qizheng7 Thanks for trying out langchain-serve.
langchain-serve tightly integrates with FastAPI & Jina, so for cloud deployment we only support Jina Cloud right now. There are a few options if you cannot use Jina Cloud directly: we can provide an option to export docker-compose
or Kubernetes YAMLs. With this, users need to take care of their own deployments/infrastructure.
Happy to have you on our slack for further discussion - https://jina.ai/slack/
Thanks @deepankarm. We will look into the Docker and K8s options of langchain-serve, and will reach out to you if we have any additional questions. Thanks.
@qizheng7 Unfortunately, these options are hidden from the user right now, since that was not the intended UX. I'd have to add this if it's needed.
@deepankarm could you please point me to the code where these sit? I can take a look at the code first.
@deepankarm Could you point us to how to set things up on Jina Cloud? We are trying to host an instance of SkyAGI as an API so that people can easily try it out and experience generative agents firsthand. It would be great if we could leverage Jina Cloud's capabilities to bring this up in a timely manner 🙏🙏🙏
@litanlitudan The first step is to refactor the current code and add the `@serving` decorator wherever an API is needed. To enable streaming and human-in-the-loop, your use case might need a WebSocket API.
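A minimal sketch of what that refactor could look like, assuming langchain-serve's `serving` decorator as documented; the `ask`/`converse` functions and their bodies are hypothetical placeholders rather than skyagi's actual entry points:

```python
# app.py -- hedged sketch, not skyagi's actual code
from lcserve import serving


@serving
def ask(question: str) -> str:
    # Hypothetical REST endpoint: the real body would call into skyagi's agents.
    return f"skyagi would answer: {question}"


@serving(websocket=True)
def converse(question: str, **kwargs) -> str:
    # Hypothetical WebSocket endpoint for streaming / human-in-the-loop.
    # langchain-serve injects a streaming handler via kwargs; in real code it
    # would be passed to the LLM as a callback so tokens stream over the socket.
    streaming_handler = kwargs.get("streaming_handler")
    return f"skyagi would stream an answer to: {question}"
```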
Here are a few open-source examples that might help you.
- pdfGPT's `ask` REST API
- ChatGLM's `predict` REST API
- Megabot's `ask` REST API
- Babyagi-as-a-service's `baby_agi` WebSocket API
- Human-in-the-loop's `hitl` WebSocket API
Once the API is ready, you can use the `lc-serve deploy` commands for local & Jina AI Cloud deployments.
@qizheng7 langchain-serve builds on Jina, and Jina has features to export to docker-compose/Kubernetes YAMLs. If needed, we can opt to expose that export via a new CLI option in langchain-serve.
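For reference, a minimal sketch of what such an export could build on, assuming Jina's `Flow.load_config`, `to_docker_compose_yaml`, and `to_kubernetes_yaml` APIs; the `flow.yml` file name and output paths here are hypothetical:

```python
from jina import Flow

# Load the Flow definition (file name is hypothetical)
f = Flow.load_config("flow.yml")

# Export to docker-compose / Kubernetes YAMLs (output paths are hypothetical)
f.to_docker_compose_yaml("docker-compose.yml")
f.to_kubernetes_yaml("./k8s")
```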
The websocket APIs are built on top of langchain-serve already.
Excuse my interruption; I can't find how to get a docker-compose setup for skyagi, and only this issue mentions it. Would you please explain?
> The websocket APIs are built on top of langchain-serve already.
Skyagi API, or the others mentioned above?