SimFG
Got it. When running this script, the network should be disconnected, so any call that needs to reach OpenAI will raise an error, which means that the cache...
Because LlamaIndex has removed this parameter.
Maybe you can try `cache.import_data()`. For reference: https://github.com/zilliztech/GPTCache/blob/main/tests/integration_tests/examples/sqlite_faiss_mock/test_example_sqlite_faiss.py
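For illustration, a minimal sketch of pre-loading question/answer pairs with `cache.import_data()`; the sqlite/faiss setup and the exact keyword names are assumptions here, so please check the linked test for the authoritative usage.

```python
# Minimal sketch: initialize GPTCache, then pre-load Q/A pairs via import_data.
# The sqlite + faiss data manager and the keyword arguments are assumptions;
# see the linked test file for the authoritative usage.
from gptcache import cache
from gptcache.embedding import Onnx
from gptcache.manager import CacheBase, VectorBase, get_data_manager

onnx = Onnx()
data_manager = get_data_manager(
    CacheBase("sqlite"),
    VectorBase("faiss", dimension=onnx.dimension),
)
cache.init(embedding_func=onnx.to_embeddings, data_manager=data_manager)

# Import previously exported (or hand-built) question/answer pairs.
cache.import_data(
    questions=["what is github"],
    answers=["GitHub is a code hosting platform for version control."],
)
```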
same issue: #576
@daniiarabdiev thanks for your attention. I checked carefully and there is no such switch yet.
https://gptcache.readthedocs.io/en/latest/references/embedding.html#module-gptcache.embedding.langchain This doc should help you solve your problem.
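Roughly, the usage looks like the sketch below; the specific LangChain embeddings class (`HuggingFaceEmbeddings`) and the dimension value are assumptions on my side, so check the linked reference for the exact constructor.

```python
# Minimal sketch: wrapping a LangChain embeddings object with GPTCache's
# gptcache.embedding.LangChain class (documented at the link above). The
# HuggingFaceEmbeddings model and the dimension value (768) are assumptions.
from gptcache import cache
from gptcache.embedding import LangChain
from gptcache.manager import CacheBase, VectorBase, get_data_manager
from langchain.embeddings import HuggingFaceEmbeddings

lc_embedding = LangChain(embeddings=HuggingFaceEmbeddings(), dimension=768)
data_manager = get_data_manager(
    CacheBase("sqlite"),
    VectorBase("faiss", dimension=lc_embedding.dimension),
)
cache.init(embedding_func=lc_embedding.to_embeddings, data_manager=data_manager)
```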
You only need to handle pydantic's validation of attributes in the class, and then you can use GPTCache. Or you can build an OpenAI proxy service and use GPTCache...
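For the proxy route, the idea is just to point the OpenAI client at your own service instead of api.openai.com; a minimal sketch, assuming a proxy is listening at http://localhost:8000 (the URL and port are placeholders):

```python
# Minimal sketch: redirect the OpenAI SDK (pre-1.0 style) to a caching proxy.
# The proxy URL is a placeholder; run your GPTCache-backed service there.
import openai

openai.api_base = "http://localhost:8000/v1"
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "what is github"}],
)
print(response["choices"][0]["message"]["content"])
```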
What is the error?
You can try clearing the cache directory. When using the cache, please keep the same embedding: if you change the embedding method, you need to delete the previous cache directory.
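Something like the following, assuming the default file names (`sqlite.db`, `faiss.index`) in the working directory; adjust the paths to whatever your data manager was actually configured with.

```python
# Minimal sketch: remove the old cache files before switching embedding models.
# "sqlite.db" and "faiss.index" are common defaults but are assumptions here;
# delete whatever files/directories your data_manager actually writes.
import os

for old_file in ("sqlite.db", "faiss.index"):
    if os.path.exists(old_file):
        os.remove(old_file)

# Then re-run cache.init(...) with the new embedding function and a matching
# vector-store dimension.
```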
You need to confirm where the 1772-dimensional vector comes from.
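A quick way to check is to print the dimension your embedding function actually produces and compare it with the dimension the vector store was created with; `Onnx` below is just an example embedding, not necessarily the one you are using.

```python
# Minimal sketch: compare the embedding dimension against the vector store's.
# Onnx is only an example; substitute whatever embedding you passed to cache.init.
from gptcache.embedding import Onnx

onnx = Onnx()
print(onnx.dimension)                          # dimension the embedding reports
print(len(onnx.to_embeddings("hello world")))  # dimension of an actual vector
# Both numbers must match the `dimension` used when creating the VectorBase.
```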