
LocalAI Image generation

Open lianee opened this issue 2 years ago • 4 comments

Describe the bug LLMStack returns 'b64_json' instead of the generated image

To Reproduce Steps to reproduce the behavior:

  1. Go to Playground
  2. Click on 'LocalAI', then 'Image Generations'
  3. Enter a description
  4. Put the LocalAI address in the "Base Url" field (even though it is already set in settings?)
  5. Click "Run" and see the error

(screenshot attachment: 2023-11-01_13-52)

LocalAI successfully generated image:

12:46PM DBG Response: {"created":1698842780,"id":"1d01c64f-45c5-4253-bcb8-a9879ea818ae","data":[{"embedding":null,"index":0,"url":"http://ai.example.com:8080/generated-images/b64762071389.png"}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}

(attached image: b64762071389.png, the image generated by LocalAI)

Expected behavior Display generated image
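
For context, the LocalAI response above delivers the image as a `url` field rather than `b64_json`, so a client that only looks for `b64_json` will fail to show anything. Below is a minimal sketch of handling both cases; this is illustrative, not LLMStack's actual code, and `extract_image_bytes` is a hypothetical helper name.

```python
import base64
import requests

def extract_image_bytes(response_json: dict) -> bytes:
    """Return raw image bytes from an OpenAI-compatible /v1/images/generations response."""
    item = response_json["data"][0]
    if item.get("b64_json"):
        # Image delivered inline as base64 (OpenAI's response_format="b64_json")
        return base64.b64decode(item["b64_json"])
    if item.get("url"):
        # Image delivered as a URL, as in the LocalAI log above
        return requests.get(item["url"], timeout=30).content
    raise ValueError("Response contains neither 'b64_json' nor 'url'")
```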

Version v0.0.17

Environment Ubuntu 22.04, pip install llmstack; LocalAI on another Debian box with several models available

lianee avatar Nov 01 '23 12:11 lianee

I am interested in working on this issue. Are we still facing this?

shubhamofbce avatar Jan 10 '24 07:01 shubhamofbce

@shubhamofbce yes, I believe this is still an issue

ajhai avatar Jan 10 '24 18:01 ajhai

The local development guide is quite old, and I am facing a lot of issues running it locally. Do you have an updated guide? The docker compose file is missing, and when I run the Django app locally it throws random errors. I am trying to fix them.

shubhamofbce avatar Jan 10 '24 18:01 shubhamofbce

yes

Jayanthudar avatar Aug 13 '24 18:08 Jayanthudar

Please use the OpenAI provider config for LocalAI going forward. See https://docs.trypromptly.com/providers#provider-configuration for more details on how to configure an OpenAI-compatible API provider.
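
For reference, here is a minimal sketch of what an OpenAI-compatible provider config maps to when pointed at LocalAI. The base URL, model name, and API key below are placeholders for your own setup, not values taken from this thread.

```python
from openai import OpenAI

# Point the standard OpenAI client at a LocalAI server's OpenAI-compatible API.
client = OpenAI(
    base_url="http://ai.example.com:8080/v1",  # your LocalAI endpoint
    api_key="not-needed-for-localai",          # LocalAI typically ignores the key
)

result = client.images.generate(
    model="stablediffusion",        # whichever image model your LocalAI instance serves
    prompt="a lighthouse at dusk",
    response_format="b64_json",     # ask for inline base64 instead of a URL
)
print(result.data[0].b64_json[:40], "...")
```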

ajhai avatar Sep 16 '24 23:09 ajhai