
"Exiting chain with error: invalid character '<' looking for beginning of value"

cartergrobinson opened this issue 10 months ago · 4 comments

Issue:

Error "Exiting chain with error: invalid character '<' looking for beginning of value"

Description:

I have installed LLocalSearch on Ubuntu 22.04 using Docker and have successfully connected it to my Ollama instance running on the same server (not in Docker). The Ollama service is configured to listen on all interfaces and responds to API requests from other systems on the LAN.

Ollama Installation:

LLocalSearch can successfully connect to my Ollama instance on the same server (not in Docker). I can tell it works because it successfully used the API to pull all-minilm:v2. I also changed the Ollama service to listen on all interfaces (e.g. Environment="OLLAMA_HOST=0.0.0.0:11434" in /etc/systemd/system/ollama.service). I verified that I get a response when I run the following from another system on the LAN, which makes me think Ollama is fine:

curl http://192.168.0.80:11434/api/generate -d '{
  "model": "llama2",
  "prompt":"Why is the sky blue?"
}'

Error

However, when I use the LLocalSearch interface at http://localhost:3000, any preset or longer chat prompt I choose results in the following error:

Exiting chain with error: invalid character '<' looking for beginning of value

The backend logs show:

2024/04/04 06:23:04 INFO Starting the server
Server started at http://localhost:8080
2024/04/04 06:25:16 INFO Creating new session session=418c6f73-c3a6-4378-82cb-2881e48dda83
2024/04/04 06:25:16 INFO Starting agent chain session=418c6f73-c3a6-4378-82cb-2881e48dda83 userQuery="{Prompt:how much does obsidian sync cost? MaxIterations:30 ModelName:knoopx/hermes-2-pro-mistral:7b-q8_0 Session:418c6f73-c3a6-4378-82cb-2881e48dda83}" startTime=2024-04-04T06:25:16.250Z
Error parsing the JSON: invalid character '<' looking for beginning of value
Exiting chain with error: invalid character '<' looking for beginning of value

Simple test queries that don't trigger a search seem to work fine. For example, if I write "test," I get the expected response:

Test
The question "test" is not specific enough for me to provide an accurate response or determine if a tool is necessary. Could you please rephrase your question or provide more context so I can better assist you?

Logs:

2024/04/04 06:53:57 INFO Starting agent chain session=e0584807-dc7f-465b-871f-5b7162cafea3 userQuery="{Prompt:test MaxIterations:30 ModelName:knoopx/hermes-2-pro-mistral:7b-q8_0 Session:e0584807-dc7f-465b-871f-5b7162cafea3}" startTime=2024-04-04T06:53:57.390Z
2024/04/04 06:53:58 mem messages [{
    1. Fromat your answer (after AI:) in markdown. 
    2. You have to use your tools to answer questions. 
    3. You have to provide the sources / links you've used to answer the quesion.
    4. You may use tools more than once.
    Question: test} { # Test
The question "test" is not specific enough for me to provide an accurate response or determine if a tool is necessary. Could you please rephrase the question or provide more context so that I can better assist you?<|im_end|> <nil>}]

Docker-compose

Here is my docker-compose configuration. All I've changed is the IP for Ollama.

services:
  backend:
    image: nilsherzig/llocalsearch-backend:latest
    environment:
      # the ip / url of YOUR Ollama server
      # CHANGE THIS
      - OLLAMA_HOST=http://192.168.0.80:11434

      # the url of the chroma db
      - CHROMA_DB_URL=http://chromadb:8000

      # the url of the searxng instance
      - SEARXNG_DOMAIN=http://searxng:8080

      # the maximum amount of iterations the agent will run to find your answer
      - MAX_ITERATIONS=30
    networks:
      - llm_network

  frontend:
    depends_on:
      - backend
    image: nilsherzig/llocalsearch-frontend:latest
    ports:
      - '3000:4173'
    networks:
      - llm_network

  chromadb:
    image: chromadb/chroma
    networks:
      - llm_network

  redis:
    image: docker.io/library/redis:alpine
    command: redis-server --save 30 1 --loglevel warning
    networks:
      - searxng
    volumes:
      - redis-data:/data
    cap_drop:
      - ALL
    cap_add:
      - SETGID
      - SETUID
      - DAC_OVERRIDE

  searxng:
    image: docker.io/searxng/searxng:latest
    networks:
      - searxng
      - llm_network
    volumes:
      - ./searxng:/etc/searxng:rw
    environment:
      - SEARXNG_BASE_URL=https://${SEARXNG_HOSTNAME:-localhost}/
    cap_drop:
      - ALL
    cap_add:
      - CHOWN
      - SETGID
      - SETUID
    logging:
      driver: 'json-file'
      options:
        max-size: '1m'
        max-file: '1'

networks:
  llm_network:
    driver: bridge
  searxng:
    ipam:
      driver: default

volumes:
  redis-data:
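One hypothesis worth checking: in setups like this, SearXNG answers the backend's format=json search requests with an HTML 403 page when JSON output is disabled (it is not enabled by default), which would produce exactly this parse error on search-triggering queries only. If that is the cause, enabling it in ./searxng/settings.yml (the path assumed from the volume mount above) should help — a hypothetical excerpt:

```yaml
# ./searxng/settings.yml — assumed excerpt; without "json" under
# search.formats, SearXNG rejects format=json requests with an HTML error page
search:
  formats:
    - html
    - json
```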

I have double-checked the configuration and connections between the components, and everything seems to be set up correctly.

Could you please help me identify the cause of this error and provide guidance on how to resolve it? Let me know if you need any additional information.

Thank you for your assistance!

cartergrobinson · Apr 04 '24 07:04