GPTCache
[Feature]: Add Redis as a VectorStore
Is your feature request related to a problem? Please describe.
Redis is a very popular vector store and caching database, widely used in industries such as fintech. Supporting it would make it easy to integrate GPTCache into existing services and APIs without introducing a new vector database such as FAISS or ChromaDB, or pulling the entire PyTorch dependency into the build image. Also, RediSearch appears to provide efficient KNN search.
Describe the solution you'd like.
Use the Redis async client and create an index dedicated to the vector store. The index name prefix must NOT match any other existing index names, to avoid namespace overlap; e.g., searching my-cache would otherwise also return entries under my-cache-gpt. A rough sketch of what this could look like is included below.
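For illustration, here is a minimal sketch of a dedicated RediSearch vector index with a non-overlapping key prefix, using redis-py's asyncio client. The index name, key prefix, field names, and embedding dimension are placeholders I chose for the example, not GPTCache's actual schema.

```python
import numpy as np
from redis.asyncio import Redis
from redis.commands.search.field import TextField, VectorField
from redis.commands.search.indexDefinition import IndexDefinition, IndexType
from redis.commands.search.query import Query

# Placeholder names: the real integration would pick its own index name,
# key prefix, field names, and embedding dimension.
INDEX_NAME = "gptcache_vectors"
KEY_PREFIX = "gptcache:vector:"  # dedicated prefix so other indexes never match these keys
DIM = 768

async def create_index(r: Redis) -> None:
    # HNSW vector field for approximate KNN search via RediSearch.
    schema = (
        TextField("id"),
        VectorField(
            "embedding",
            "HNSW",
            {"TYPE": "FLOAT32", "DIM": DIM, "DISTANCE_METRIC": "COSINE"},
        ),
    )
    definition = IndexDefinition(prefix=[KEY_PREFIX], index_type=IndexType.HASH)
    await r.ft(INDEX_NAME).create_index(fields=schema, definition=definition)

async def knn_search(r: Redis, query_vec: np.ndarray, top_k: int = 4):
    # KNN query against the vector field; DIALECT 2 is required for vector queries.
    q = (
        Query(f"*=>[KNN {top_k} @embedding $vec AS score]")
        .sort_by("score")
        .return_fields("id", "score")
        .dialect(2)
    )
    params = {"vec": query_vec.astype(np.float32).tobytes()}
    return await r.ft(INDEX_NAME).search(q, query_params=params)
```

Because the index definition is bound to its own key prefix, queries against other indexes (e.g. my-cache) would not pick up these entries.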
Describe an alternate solution.
No response
Anything else? (Additional Context)
I think this would be a really nice feature for those who want to use caching without adding a dedicated vector store to their infrastructure.
Thanks for your suggestion; it will be included in our plan. Of course, we also welcome contributions.
Hi, is the Redis integration in the pipeline now?
@Torhamilton Thanks for your attention. I need some time to understand Redis, and with a lot of recent work I haven't had the opportunity to address this yet. 😫 If you're knowledgeable about this topic, I'd appreciate a PR.
Hi @SimFG, I have been working with the Redis vector store and LangChain, and I am interested in working on a PR for the Redis vector store integration with GPTCache.
Could you assign this issue to me?
/assign @AvikantSrivastava
Great! Thanks for the contribution
Link to the PR https://github.com/zilliztech/GPTCache/pull/414
Please refer to the description of this feature request regarding use of the Redis async client. Also refer to this issue for an explanation: https://github.com/zilliztech/GPTCache/issues/415
@vinvcn Redis will be supported in the next version, but async is not supported for now, because that part involves some refactoring.
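For reference, a hypothetical usage sketch of what configuring the cache might look like once the Redis backend lands, assuming it is exposed through GPTCache's VectorBase factory under the name "redis". The backend name and keyword arguments (host, port, dimension) are assumptions and may differ from the released API.

```python
from gptcache import cache
from gptcache.manager import CacheBase, VectorBase, get_data_manager

# Assumed backend name and parameters; check the merged PR for the actual signature.
vector_base = VectorBase(
    "redis",
    host="localhost",
    port=6379,
    dimension=768,  # embedding dimension of the chosen embedding model
)
data_manager = get_data_manager(CacheBase("sqlite"), vector_base)
cache.init(data_manager=data_manager)
```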