
ChatLocalAI - Error: endpoint disabled for this model by API configuration

Open shrutifiske opened this issue 2 years ago • 11 comments

(screenshot: ChatLocalAI-APIError)

(screenshot: flowise-error-log)

Is there anyone else encountering the same problem with the ChatLocalAI model?

shrutifiske avatar Oct 30 '23 07:10 shrutifiske

Similar issue here when trying to use LocalAI Embeddings

Cam-B avatar Nov 09 '23 02:11 Cam-B

Got the same problem

Isaac24Karat avatar Nov 20 '23 18:11 Isaac24Karat

Similar issue here when trying to use LocalAI Embeddings

@Cam-B

Did you solve it? I am using the bert-embeddings backend:

`embedding = OpenAIEmbeddings(model="text-embedding-ada-002", openai_api_base=base_path)`, and I get the "endpoint disabled for this model by API configuration" error. Did you manage to solve it?

MarcoBoo avatar Nov 24 '23 10:11 MarcoBoo

@Cam-B

Did you solve it? I am using the bert-embeddings backend:

At the time I posted this, there seems to have been a known bug in the LocalAI release; I'm not sure if that has been resolved yet. I switched to using Ollama for embeddings, and that worked fine for me. @MarcoBoo

Cam-B avatar Nov 26 '23 08:11 Cam-B

@Cam-B Any idea where one can track the bug for any updates?

Edit: I've tried both LocalAI and Ollama, for both the chat chain and the embeddings, and I still get that "endpoint disabled for this model by API configuration" error

seanmavley avatar Dec 08 '23 23:12 seanmavley

I reinstalled WSL from scratch, and everything is working as expected on the GPU, with no endpoint errors.

seanmavley avatar Dec 18 '23 23:12 seanmavley

Similar issue here when trying to use LocalAI Embeddings

rafso avatar Feb 04 '24 00:02 rafso

Same issue for me:

    2024-02-19 22:25:41 [ERROR]: [server]: Error: Error: Error: 500 endpoint disabled for this model by API configuration
    Error: Error: Error: 500 endpoint disabled for this model by API configuration
        at buildLangchain (/usr/local/lib/node_modules/flowise/dist/utils/index.js:331:19)
        at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
        at async App.upsertVector (/usr/local/lib/node_modules/flowise/dist/index.js:1345:13)
        at async /usr/local/lib/node_modules/flowise/dist/index.js:1017:13

ilanb avatar Feb 19 '24 21:02 ilanb

Same error here. Has anyone found a workaround? I read that it could come from not enough memory allocated to the Docker container running LocalAI. I have 8 GB allocated; does someone with more have the same issue?

rstaessens avatar Feb 26 '24 14:02 rstaessens

@rstaessens I had the same issues. Fixed by reinstalling my WSL from scratch.

The issue is likely related to a bad installation of something. What that "something" is, I couldn't tell.

seanmavley avatar Feb 26 '24 14:02 seanmavley

I have the same issue. Here are my installation steps on CentOS 7:

==== AI Installation Steps ====

These are the steps I took so far:

  1. SSH into box

    ssh xxx.xxx.xxx.xxx

  2. install Git

    a. On CentOS the package manager is yum: `sudo yum install git`

    b. Verify the installation: `git --version`

  3. Install Docker

    """Docker is the underlying platform and runtime for containers."""

    a. Find and install Docker: `sudo yum search docker`, then `sudo yum install -y docker`

    b. Check that Docker is installed: `docker version`

    c. Enable the Docker service at boot time: `sudo systemctl enable docker.service`

    d. Start the Docker service: `sudo systemctl start docker.service`

    e. Useful Docker commands

     See all available docker containers:
     	docker ps -a
     Delete a container: 
     	docker rm [container_id] --force
    
     Control:
     sudo systemctl start docker.service #<-- start the service
     sudo systemctl stop docker.service #<-- stop the service
     sudo systemctl restart docker.service #<-- restart the service
     sudo systemctl status docker.service #<-- get the service status
    
  4. Install Docker-Compose

    """Docker Compose is a tool built on top of Docker to streamline the management of multi-container applications. They work together to simplify the development, deployment, and scaling of containerized applications."""

    a. Download the Docker Compose binary into the /usr/local/bin directory: `sudo curl -L "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose`

    b. Give the Docker Compose binary execute permissions: `sudo chmod +x /usr/local/bin/docker-compose`

    c. After installation, verify that Docker Compose is installed correctly: `docker-compose --version`

    d. Make sure that /usr/local/bin, where Docker Compose is installed, is included in your system's PATH environment variable. You can check and update your PATH by editing the ~/.bashrc or ~/.bash_profile file: `export PATH="/usr/local/bin:$PATH"`

    e. After updating the file, reload the shell configuration: `source ~/.bashrc` (or `source ~/.bash_profile`)

    f. Test by running the `docker-compose` command
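Steps d and e above can be sanity-checked non-interactively; a minimal sketch (the `case` pattern is just one portable way to test PATH membership):

```shell
# Prepend /usr/local/bin to PATH (in practice, persist this export in ~/.bashrc)
export PATH="/usr/local/bin:$PATH"

# Check whether /usr/local/bin is actually on PATH
case ":$PATH:" in
  *:/usr/local/bin:*) echo "on PATH" ;;
  *)                  echo "missing" ;;
esac
```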

  5. Make a new project directory

    mkdir structer && cd structer

  6. Install LocalAI

    a. Clone LocalAI with git: `git clone https://github.com/go-skynet/LocalAI && cd LocalAI`

    b. Create a new .env file: `touch .env`

    c. Use the cat command to insert data into the new file: `cat > .env`, paste the contents, then press Ctrl+D to save
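For step c, a heredoc avoids the interactive paste-then-Ctrl+D dance. A minimal sketch; the variable names (THREADS, CONTEXT_SIZE, MODELS_PATH) are assumptions taken from LocalAI's example .env, so check the .env.example in your checkout for the authoritative list:

```shell
# Write a minimal .env non-interactively (variable names assumed from
# LocalAI's .env.example; adjust for your release)
cat > .env <<'EOF'
THREADS=4
CONTEXT_SIZE=512
MODELS_PATH=/models
EOF

# Quick sanity check: three KEY=VALUE lines
grep -c '=' .env
```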

    d. Check the docker-compose YAML file: `nano docker-compose.yaml`

    version: '3.6'

    services:
      api:
        image: quay.io/go-skynet/local-ai:latest
        build:
          context: .
          dockerfile: Dockerfile
        tty: true
        ports:
          - 8080:8080
        env_file:
          - .env
        volumes:
          - ./models:/models:cached
        command: ["/usr/bin/local-ai"]

e. Build container and start LocalAI
	docker-compose up -d --pull always

f. Check that LocalAI container is up and running
	docker ps -a
  7. Change network/firewall rules to enable API access

    a. Check firewalld status: `sudo systemctl status firewalld`

    b. If it displays Active: inactive (dead), activate it: `sudo systemctl start firewalld`

    c. Add port 8080 to the firewall rules. To allow traffic on port 8080, use: `sudo firewall-cmd --zone=public --add-port=8080/tcp --permanent`

    d. Reload firewalld to apply the changes: `sudo firewall-cmd --reload`

    e. Verify the rule was added and the port is listed: `sudo firewall-cmd --list-all`

  8. Test the API endpoint from outside the server using Postman

    a. Open Postman and send a GET request: `GET http://xxx.xxx.xxx.xxx:8080/v1/models`

    Response 200:

        { "object": "list", "data": [] }

    b. Load a new model: `POST http://xxx.xxx.xxx.xxx:8080/models/apply` with the following JSON body:

        {
          "url": "github:go-skynet/model-gallery/bert-embeddings.yaml",
          "name": "text-embedding-ada-002"
        }

    Response:

        {
          "uuid": "02388784-e13d-11ee-b9d9-0242ac120002",
          "status": "http://xxx.xxx.xxxx.xxxx:8080/models/jobs/02388784-e13d-11ee-b9d9-0242ac120002"
        }

    c. Check the request status using the link in the response: `GET http://xxxx.xxxx.xxx.xxx:8080/models/jobs/02388784-e13d-11ee-b9d9-0242ac120002`

  9. Install BERT

    The bert backend uses bert.cpp and ggml models. For instance, you can download the ggml-quantized version of all-MiniLM-L6-v2 from https://huggingface.co/skeskinen/ggml:

    a. Download the model from the Hugging Face repo into the models folder: `wget https://huggingface.co/skeskinen/ggml/resolve/main/all-MiniLM-L6-v2/ggml-model-q4_0.bin -O models/bert`

    b. Check the model is there by running a local API call: `curl http://localhost:8080/models`

    c. Test the model (see "Making requests to LocalAI" below)

  10. CentOS jq installation

    jq is not in the default CentOS repositories, but it is available in the EPEL (Extra Packages for Enterprise Linux) repository, which contains additional packages.

    a. Enable the EPEL repository: `sudo yum install epel-release`

    b. Refresh the package repository: `sudo yum update`

    c. Install the JSON command-line processor: `sudo yum install jq`

  11. Making requests to LocalAI

    curl http://localhost:8080/embeddings -X POST -H "Content-Type: application/json" -d '{"input": "Who are you?", "model": "text-embedding-ada-002"}' | jq "."
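To pull just the vector (or its dimensionality) out of the embeddings response, a jq filter helps. A sketch against a canned, abbreviated sample of the OpenAI-compatible response shape, so it runs without the server; a real call would pipe the curl output instead:

```shell
# Canned, abbreviated /embeddings response (OpenAI-compatible shape)
SAMPLE='{"object":"list","data":[{"object":"embedding","index":0,"embedding":[0.01,0.02,0.03]}],"model":"text-embedding-ada-002"}'

# Extract the vector itself
echo "$SAMPLE" | jq -c '.data[0].embedding'

# ...or just its dimensionality
echo "$SAMPLE" | jq '.data[0].embedding | length'   # → 3
```

A real model returns a 384-dimensional vector for all-MiniLM-L6-v2, so the length check is a quick way to confirm the model actually responded.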

alexbadoi avatar Mar 13 '24 15:03 alexbadoi