Alex Yakubovich
With `chunk_size = 1` I am getting an error when doing big embeddings.
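For reference, a minimal sketch of where `chunk_size` comes in, assuming langchain's `OpenAIEmbeddings` wrapper (the texts are placeholders):

```python
# Minimal sketch, assuming langchain's OpenAIEmbeddings wrapper.
# chunk_size controls how many texts go into each embedding request;
# chunk_size=1 sends one text per API call (often used with Azure OpenAI),
# which gets very slow/error-prone for big inputs.
from langchain.embeddings import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(chunk_size=1)
vectors = embeddings.embed_documents(["first text", "second text"])  # placeholder texts
```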
@lucasandre22 do you use Azure OpenAI APIs?
any updates?
@hwchase17 Hi. I'm still getting this error in the scan: https://nvd.nist.gov/vuln/detail/CVE-2023-34540 Can we reopen?
Getting this for `FAISS.from_documents(data, embeddings)`:
```
Traceback (most recent call last):
  File "/app/scheduler/4_generate_embeddings.py", line 52, in
    vectors = FAISS.from_documents(data, embeddings)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain/vectorstores/base.py", line 332, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas,...
```
The OpenAI API rate limit is a big problem. But OpenAI embeddings are not the best anyway, so it can make sense to just use a free one (see the MTEB leaderboard: https://huggingface.co/spaces/mteb/leaderboard).
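For example, a rough sketch of swapping in a free local model, assuming `sentence-transformers` and `faiss` are installed (the model name is just one example from the leaderboard):

```python
# Minimal sketch: local embeddings via langchain's HuggingFaceEmbeddings,
# so no OpenAI rate limits apply.
from langchain.embeddings import HuggingFaceEmbeddings
from langchain.schema import Document
from langchain.vectorstores import FAISS

embeddings = HuggingFaceEmbeddings(model_name="BAAI/bge-small-en-v1.5")  # example model
data = [Document(page_content="first chunk"), Document(page_content="second chunk")]  # placeholder docs
vectors = FAISS.from_documents(data, embeddings)
```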
My app is in an Azure environment, and I was able to figure out that it happened because multiple replicas of the same app were running, and I had to configure...
@marcklingen, yes (2.26.2). It happens only with langfuse.openai. langfuse.callback for Langchain works great and we can see traces.
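For context, a rough sketch of the two integration paths being compared, assuming the Langfuse v2 Python SDK with credentials set via the usual LANGFUSE_* environment variables (the `chain` object is just a placeholder for any Langchain runnable):

```python
# Path 1: OpenAI drop-in wrapper (the one where traces do not show up for us)
from langfuse.openai import openai

openai.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)

# Path 2: Langchain callback handler (this one traces fine)
from langfuse.callback import CallbackHandler

handler = CallbackHandler()
chain.invoke({"input": "Hello"}, config={"callbacks": [handler]})  # `chain` = any Langchain runnable (placeholder)
```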
@pamelafox, we are hosting Langfuse on Azure as an Azure Web App and using Azure PostgreSQL - Flexible Server. Azure Container Apps would be another option. It was very simple to...