llm-multitool
Problem with integration with LM Studio local server
Hello,
I am getting the following error when sending a prompt to an LM Studio local server:
```
[GIN] 2024/03/05 - 17:49:50 | 200 | 130.618µs | 10.0.8.215 | PUT  "/api/session/c50a93e4-9d00-46dc-9e64-3972a95efede/prompt"
[GIN] 2024/03/05 - 17:49:50 | 200 | 186.652µs | 10.0.8.215 | PUT  "/api/session/c50a93e4-9d00-46dc-9e64-3972a95efede/prompt"
[GIN] 2024/03/05 - 17:49:52 | 200 | 108.349µs | 10.0.8.215 | POST "/api/session/c50a93e4-9d00-46dc-9e64-3972a95efede/response"
[GIN] 2024/03/05 - 17:49:52 | 200 | 146.219µs | 10.0.8.215 | POST "/api/session/c50a93e4-9d00-46dc-9e64-3972a95efede/response"
2024/03/05 17:49:52 engine worker: enqueue 0xc0000f12c0
2024/03/05 17:49:52 OpenAiEngineBackend process(): Starting request
2024/03/05 17:49:52 OpenAiEngineBackend process(): ChatCompletionStream error: error, status code: 400, message:
2024/03/05 17:49:52 engine worker: compute done
```
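For reference, here is a minimal sketch I would use to try to reproduce the 400 outside of llm-multitool. It assumes the backend talks to LM Studio's OpenAI-compatible endpoint through the sashabaranov/go-openai client (the "ChatCompletionStream error" wording suggests that, but it is an assumption), and the base URL and model name below are placeholders for my local setup:

```go
// Minimal reproduction sketch, NOT llm-multitool's actual code.
// Assumptions: LM Studio is serving its OpenAI-compatible API at
// http://localhost:1234/v1 and the sashabaranov/go-openai client is used.
package main

import (
	"context"
	"errors"
	"fmt"
	"io"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	cfg := openai.DefaultConfig("lm-studio") // LM Studio ignores the API key
	cfg.BaseURL = "http://localhost:1234/v1" // assumed default LM Studio address
	client := openai.NewClientWithConfig(cfg)

	// Open a streaming chat completion, the same call that appears to fail
	// with "status code: 400" in the log above.
	stream, err := client.CreateChatCompletionStream(context.Background(), openai.ChatCompletionRequest{
		Model: "local-model", // placeholder; LM Studio uses whichever model is loaded
		Messages: []openai.ChatCompletionMessage{
			{Role: openai.ChatMessageRoleUser, Content: "Hello"},
		},
		Stream: true,
	})
	if err != nil {
		fmt.Println("stream error:", err)
		return
	}
	defer stream.Close()

	// Print streamed tokens until the server signals end of stream.
	for {
		resp, err := stream.Recv()
		if errors.Is(err, io.EOF) {
			break
		}
		if err != nil {
			fmt.Println("recv error:", err)
			return
		}
		fmt.Print(resp.Choices[0].Delta.Content)
	}
}
```

If a bare request like this succeeds against the same server, the 400 presumably comes from something in the request llm-multitool builds (model name, parameters, or message payload) rather than from LM Studio itself.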