
[RAG application bug] Warning: TT: undefined function: 32

Open danielstankw opened this issue 1 year ago • 2 comments

I am trying to create a super simple RAG application locally. I am using a local LLM which I have verified to work. The embeddings are Azure OpenAI, and I wrote a Python script to test them, which also works. Whenever I run the pipeline I get the following error.
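For context, a minimal smoke test of Azure OpenAI embeddings can look something like the sketch below; the endpoint, key, API version, and deployment name are placeholders, not values from this setup.

```python
# Minimal Azure OpenAI embeddings smoke test (sketch; all credentials and
# names below are placeholders, not the actual deployment).
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",
    api_key="<your-azure-openai-key>",
    api_version="2024-02-01",
)

response = client.embeddings.create(
    model="<your-embedding-deployment>",  # the Azure deployment name
    input="hello world",
)

# If this prints the embedding dimension, the embeddings themselves are fine.
print(len(response.data[0].embedding))
```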

Saving works, but I get the error below every time I upsert the vector database.


2024-04-08 20:28:52 [INFO]: ⬆️ POST /api/v1/vector/internal-upsert/2c177181-8271-435e-98ff-57504e26af17
Warning: TT: undefined function: 32
2024-04-08 20:30:19 [ERROR]: Error: Connection error.
Error: Error: Connection error.
    at InMemoryVectorStore_VectorStores.upsert (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/vectorstores/InMemory/InMemoryVectorStore.js:25:27)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async buildFlow (/usr/local/lib/node_modules/flowise/dist/utils/index.js:265:17)
    at async App.upsertVector (/usr/local/lib/node_modules/flowise/dist/index.js:1698:13)
    at async /usr/local/lib/node_modules/flowise/dist/index.js:1193:13
2024-04-08 20:30:19 [ERROR]: [server]: Error: Error: Error: Connection error.
Error: Error: Error: Connection error.
    at buildFlow (/usr/local/lib/node_modules/flowise/dist/utils/index.js:326:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async App.upsertVector (/usr/local/lib/node_modules/flowise/dist/index.js:1698:13)
    at async /usr/local/lib/node_modules/flowise/dist/index.js:1193:13

danielstankw · Apr 08 '24 20:04

The error says that it is not able to reach LocalAI. If you are running both Flowise and LocalAI on Docker, you might have to change the LocalAI base path - https://docs.flowiseai.com/integrations/langchain/chat-models/chatlocalai#flowise-setup
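As a quick way to check that, a small connectivity test run from the machine or container hosting Flowise tells you whether the model endpoint is reachable at all. The sketch below assumes a default LocalAI-style OpenAI-compatible server on port 8080; the host and port are assumptions, not values from this issue.

```python
# Connectivity check from wherever Flowise runs (sketch; host and port are
# assumptions for a default LocalAI-style setup, adjust to your endpoint).
import requests

# Inside a Docker container, "localhost" points at the container itself,
# not the host machine; "host.docker.internal" usually reaches the host
# (on Linux it may require --add-host=host.docker.internal:host-gateway).
base_path = "http://host.docker.internal:8080/v1"

resp = requests.get(f"{base_path}/models", timeout=10)
print(resp.status_code, resp.json())
```

If this fails from inside the Flowise container but succeeds from the host, the base path configured in the chat model node is the likely culprit.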

HenryHengZJ · Apr 09 '24 10:04

@HenryHengZJ I am hosting my LLM using vLLM and connecting to the endpoint using FastAPI. I have tested this before and I am able to use the local LLM with no issues, so I do not see why it would be the source of the problem here. Do you have other suggestions on what could be the problem?
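The same kind of check applies to a vLLM OpenAI-compatible endpoint. The sketch below is a minimal chat-completion request run from wherever Flowise runs; the host, port (vLLM's default is 8000), and model name are placeholders rather than values from this setup.

```python
# Minimal chat-completion smoke test against a vLLM OpenAI-compatible server
# (sketch; host, port, and model name below are placeholders).
from openai import OpenAI

client = OpenAI(
    base_url="http://host.docker.internal:8000/v1",  # vLLM serves on 8000 by default
    api_key="not-needed",  # vLLM does not require a key unless configured to
)

resp = client.chat.completions.create(
    model="<served-model-name>",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```

If this works from the host but fails from inside the Flowise container, the issue is Docker networking rather than the model or the flow itself.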

danielstankw · Apr 09 '24 13:04