LocalAI Image generation
**Describe the bug**
LLMStack returns `b64_json` instead of the generated image.
**To Reproduce**
Steps to reproduce the behavior:
- Go to Playground
- Click on 'LocalAI' → 'Image Generations'
- Enter a description
- Put the LocalAI address in the "Base Url" field (even though it is already set in Settings?)
- Click "Run" and see the error
LocalAI successfully generated the image:

```
12:46PM DBG Response: {"created":1698842780,"id":"1d01c64f-45c5-4253-bcb8-a9879ea818ae","data":[{"embedding":null,"index":0,"url":"http://ai.example.com:8080/generated-images/b64762071389.png"}],"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}
```
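Note that in an OpenAI-style image response, each entry in `data` carries either a `url` or a `b64_json` field, depending on the requested `response_format`, so a client needs to handle both. Here the server returned a `url`, which the client should display (or fetch). A minimal sketch of that handling, assuming the OpenAI images response shape (the `extract_image` helper is hypothetical, not LLMStack's actual code):

```python
import base64
import json


def extract_image(response_json: str):
    """Return an image URL (str) or raw image bytes (bytes) from an
    OpenAI-style image generation response.  Each entry in `data` has
    either a `url` or a `b64_json` field, depending on the requested
    `response_format`."""
    entry = json.loads(response_json)["data"][0]
    if entry.get("url"):
        return entry["url"]  # caller fetches/displays the image at this URL
    if entry.get("b64_json"):
        return base64.b64decode(entry["b64_json"])  # raw image bytes
    raise ValueError("response contains neither 'url' nor 'b64_json'")


# The LocalAI debug response from this issue:
sample = (
    '{"created":1698842780,"id":"1d01c64f-45c5-4253-bcb8-a9879ea818ae",'
    '"data":[{"embedding":null,"index":0,'
    '"url":"http://ai.example.com:8080/generated-images/b64762071389.png"}],'
    '"usage":{"prompt_tokens":0,"completion_tokens":0,"total_tokens":0}}'
)
print(extract_image(sample))
```

Running this on the logged response yields the image URL, which is what the UI should render instead of falling back to a `b64_json` path.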
**Expected behavior**
Display the generated image.
**Version**
v0.0.17

**Environment**
Ubuntu 22.04, installed via `pip install llmstack`; LocalAI runs on a separate Debian box with several models available.
I am interested in working on this issue. Are we still facing this?
@shubhamofbce yes, I believe this is still an issue.
The local development guide is quite old, and I am facing a lot of issues running it locally. Do you have an updated guide? The docker compose file is missing, and when I run the Django app locally it throws random errors. I am trying to fix them.
yes
Please use the OpenAI provider config going forward for LocalAI. See https://docs.trypromptly.com/providers#provider-configuration for details on how to configure an OpenAI-compatible API provider.
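For context on why the OpenAI-compatible route works: LocalAI exposes the OpenAI images endpoint (`/v1/images/generations`), and the `response_format` field in the request body selects whether the server returns a `url` or `b64_json` per image. A minimal sketch of building such a request, assuming only the standard OpenAI request shape (the base URL and prompt below are placeholders, not values from this issue):

```python
import json
import urllib.request


def build_image_request(base_url: str, prompt: str,
                        response_format: str = "url") -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible image generation
    endpoint.  `response_format` must be 'url' or 'b64_json'."""
    if response_format not in ("url", "b64_json"):
        raise ValueError("response_format must be 'url' or 'b64_json'")
    body = json.dumps({"prompt": prompt,
                       "response_format": response_format}).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/v1/images/generations",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Example (placeholder host; actually sending it requires a running LocalAI):
req = build_image_request("http://ai.example.com:8080", "a cat in a hat")
print(req.full_url)
```

Requesting `response_format="url"` matches the behavior seen in the debug log above, where LocalAI serves the image from its `/generated-images/` path.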