ether8unny

Results: 14 comments by ether8unny

> If this will get implemented here, one idea of improving this concept even further would be the introduction of a hierarchy of trees. The first level would contain a...

This is actually an issue with all AI memory in general, not specific to LangChain. For the AI to differentiate between the two conversations you need something that also supplies...
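The comment is truncated, but one common way to let a system tell two conversations apart is to partition stored messages by a conversation/session identifier so retrieval only sees the relevant thread. A toy sketch of that idea (the `SessionMemory` class and its method names are mine, not from the comment or LangChain):

```python
class SessionMemory:
    """Toy illustration: memories partitioned by a session id (hypothetical helper)."""

    def __init__(self):
        # Maps session_id -> list of messages for that conversation only.
        self._store = {}

    def add(self, session_id, message):
        self._store.setdefault(session_id, []).append(message)

    def get(self, session_id):
        # Return a copy so callers cannot mutate the stored history.
        return list(self._store.get(session_id, []))


mem = SessionMemory()
mem.add("conv-1", "hello from conversation one")
mem.add("conv-2", "hello from conversation two")
```

Because each lookup is scoped to one session id, memories from `conv-1` can never leak into a retrieval for `conv-2`.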

I've encountered this as well, and while it's not a PR or a permanent code fix, here's a temporary workaround you can add to your code: `if isinstance(texts, str): texts =...`
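The snippet above is cut off; a minimal sketch of what such a guard presumably looks like (the `ensure_list` wrapper name is mine, not from the comment):

```python
def ensure_list(texts):
    # Hypothetical helper: code that iterates over a list of texts will
    # otherwise iterate character-by-character when handed a bare string,
    # so wrap a lone string in a single-element list first.
    if isinstance(texts, str):
        texts = [texts]
    return texts
```

Calling `ensure_list("hello")` yields `["hello"]`, while an existing list passes through unchanged.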

I'm loading both documents and text, and then using Redis.from_existing_index, Redis.from_texts, and Redis.from_documents, and as a retriever class using retriever.get_relevant_documents, all with success. No, I'm not saying that it is...

Also, this is how my create instance looks in redis.py:

```python
# Create instance
instance = cls(
    redis_url=redis_url,
    index_name=index_name,
    embedding_function=embedding.embed_query,
    content_key=content_key,
    metadata_key=metadata_key,
    vector_key=vector_key,
    **kwargs,
```

and

```python
instance, _ = cls.from_texts_return_keys(
    texts=texts,...
```

> also throws `ResponseError: link: no such index`,
>
> ```
> 251 return [doc for doc, _ in docs_and_scores]
>
> File ~/miniconda3/envs/biogpt/lib/python3.10/site-packages/langchain/vectorstores/redis.py:327, in Redis.similarity_search_with_score(self, query, k)
> ...
> ```

I'm running on stable-0.2.2 to maintain memory, with Redis. `git checkout -b stable-0.2.2` if you want to revert for the time being.

This, I think, is related to the JSON issue. Somewhere a forward slash is not escaping correctly, possibly in the fix-JSON parts of the code; it's causing additional forward slashes...
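For context on the escaping behaviour being described, here is a small stdlib-only illustration (not the LangChain code itself): `json.dumps` leaves `/` unescaped, `\/` is nonetheless a legal JSON escape, and re-serializing already-serialized output is one way stray escape characters get introduced:

```python
import json

s = "path/to/file"
once = json.dumps(s)
assert once == '"path/to/file"'               # "/" needs no escaping in JSON
assert json.loads('"path\\/to\\/file"') == s  # but "\/" is a valid escape too

# Double-encoding already-serialized text adds another layer of escape
# characters, which is one way "extra" slashes appear downstream.
twice = json.dumps(once)
assert json.loads(twice) == once  # decodes back to the quoted string, not s
```

The key point is that a decoder treating `\/` or a re-encoded payload inconsistently will surface exactly the kind of duplicated-slash artifacts the comment describes.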

> @ether8unny Do you know if anyone else has been able to work around this?

It doesn't seem to affect all users, but specifically Windows users on GPT-3.5. I currently...