matrix-chatgpt-bot
Switch to use Langchain
This would probably be better if it used LangChain, which would provide more functionality. We could still default to ChatGPT, and I don't think we would lose anything in the process.
What does everyone think?
I know this was raised quite a while back, but I've recently been thinking it would be amazing to actually use LangChain properly.
Incorporating extra memory/context is an obvious application here, but even just running queries through a cheaper model first, to decide whether it's worth spending the money on a GPT-4 or 4.5 response, could be really useful.
I'm thinking, for example, of a support room where each incoming message is run through a cheap model to decide whether it's "a message that would be best answered by a technical engineer". I've tried something similar at work in non-Matrix settings, but I'd be very interested to try it in one or two public support rooms on Matrix, to handle some of the simpler problems without paying for a full GPT-4 response on every incoming message.
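As a rough illustration of that routing idea, here's a minimal sketch of the decision logic. The function names, the prompt wording, and the model tiers are all hypothetical (this isn't code from the bot or from LangChain); the actual calls to the cheap and expensive models would be wired up separately, e.g. via LangChain chat model wrappers.

```typescript
// Hypothetical sketch: gate expensive completions behind a cheap classifier.
// Nothing here is from matrix-chatgpt-bot; names are illustrative only.

type Tier = "cheap" | "expensive";

// Build the prompt sent to the cheap classifier model for each message.
function classifierPrompt(message: string): string {
  return (
    "Answer only 'yes' or 'no': is the following message one that " +
    "would be best answered by a technical engineer?\n\n" + message
  );
}

// Parse the cheap model's one-word verdict into a routing decision.
// Anything that isn't a clear "yes" falls back to the cheap tier, so a
// confused classifier never silently burns GPT-4 money.
function chooseTier(classifierReply: string): Tier {
  const verdict = classifierReply.trim().toLowerCase();
  return verdict.startsWith("yes") ? "expensive" : "cheap";
}
```

The important design choice is that the fallback is the cheap tier: an ambiguous or malformed classifier reply costs a cheap answer, not a GPT-4 one.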
Someone did some work on this, but they never opened a PR:
https://github.com/pangeachat/matrix-chatgpt-bot
It looks interesting, especially the use of davinci and embeddings... though the changes would probably need to be cherry-picked, because there are quite a few odd references to their specific bot implementation 🤔