
DevoxxGenieIDEAPlugin won't work with Open WebUI API

somera opened this issue 4 months ago • 4 comments

Hi,

I'm trying to configure DevoxxGenieIDEAPlugin v0.6.9 with the Open WebUI API (https://docs.openwebui.com/getting-started/api-endpoints/).

[screenshot]

The Ollama URL works fine.

The Open WebUI API is working too:

curl -H "Authorization: Bearer XXXXXXX" http://docker-tools:3000/api/models

With curl I get a JSON response listing all models.
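To rule out curl specifics, the same GET can be sketched with the JDK HTTP client (the client stack the plugin uses via langchain4j; host and token are the placeholders from above):

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ModelsRequest {
    public static void main(String[] args) {
        // Build the request curl sends; actually sending it needs a live Open WebUI instance.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://docker-tools:3000/api/models"))
                .header("Authorization", "Bearer XXXXXXX")
                .GET()
                .build();
        System.out.println(request.method() + " " + request.uri()); // GET http://docker-tools:3000/api/models
    }
}
```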

But it is not working in DevoxxGenieIDEAPlugin.

I get

[screenshot]

When I set Custom OpenAI Model=devstral:24b I get

[screenshot]
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 22:01:39.977 [prompt-exec-6] ERROR c.d.g.s.p.error.PromptErrorHandler - Error occurred while processing chat message
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - java.util.concurrent.CompletionException: com.devoxx.genie.service.prompt.error.ModelException: Provider unavailable: Invalid HTTP request received.
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at java.base/java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:315)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at java.base/java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:320)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1770)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at java.base/java.lang.Thread.run(Thread.java:1583)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - Caused by: com.devoxx.genie.service.prompt.error.ModelException: Provider unavailable: Invalid HTTP request received.
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at com.devoxx.genie.service.prompt.response.nonstreaming.NonStreamingPromptExecutionService.processChatMessage(NonStreamingPromptExecutionService.java:206)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at com.devoxx.genie.service.prompt.response.nonstreaming.NonStreamingPromptExecutionService.lambda$executeQuery$0(NonStreamingPromptExecutionService.java:75)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1768)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	... 3 common frames omitted
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - Caused by: dev.langchain4j.exception.InvalidRequestException: Invalid HTTP request received.
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.ExceptionMapper$DefaultExceptionMapper.mapHttpStatusCode(ExceptionMapper.java:69)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.ExceptionMapper$DefaultExceptionMapper.mapException(ExceptionMapper.java:42)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.ExceptionMapper.withExceptionMapper(ExceptionMapper.java:29)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.RetryUtils.lambda$withRetryMappingExceptions$2(RetryUtils.java:307)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.RetryUtils$RetryPolicy.withRetry(RetryUtils.java:195)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.RetryUtils.withRetry(RetryUtils.java:247)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.RetryUtils.withRetryMappingExceptions(RetryUtils.java:307)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.RetryUtils.withRetryMappingExceptions(RetryUtils.java:291)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.model.openai.OpenAiChatModel.doChat(OpenAiChatModel.java:151)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.model.chat.ChatLanguageModel.chat(ChatLanguageModel.java:47)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.service.DefaultAiServices$1.invoke(DefaultAiServices.java:224)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at com.devoxx.genie.service.prompt.response.nonstreaming.$Proxy251.chat(Unknown Source)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at com.devoxx.genie.service.prompt.response.nonstreaming.NonStreamingPromptExecutionService.processChatMessage(NonStreamingPromptExecutionService.java:183)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	... 5 common frames omitted
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - Caused by: dev.langchain4j.exception.HttpException: Invalid HTTP request received.
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.http.client.jdk.JdkHttpClient.execute(JdkHttpClient.java:51)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.model.openai.internal.SyncRequestExecutor.execute(SyncRequestExecutor.java:20)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.model.openai.internal.RequestExecutor.execute(RequestExecutor.java:39)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.model.openai.OpenAiChatModel.lambda$doChat$3(OpenAiChatModel.java:152)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	at dev.langchain4j.internal.ExceptionMapper.withExceptionMapper(ExceptionMapper.java:27)
2025-08-04 22:01:39,977 [ 379028]   INFO - STDOUT - 	... 15 common frames omitted
2025-08-04 22:01:39,978 [ 379029]   INFO - STDOUT - 22:01:39.978 [prompt-exec-5] ERROR c.d.g.s.prompt.error.PromptException - ERROR:Null response received - false
  1. Is my configuration wrong?
  2. Is there a bug?

somera avatar Aug 04 '25 20:08 somera

Can you give a bit more info about your setup so we can try to simulate it?

stephanj avatar Aug 11 '25 07:08 stephanj

I'm using:

  • IDEA 2025.2 Ultimate Edition
  • DevoxxGenieIDEAPlugin v0.6.9
  • Ollama v0.11.4 -> when I use Ollama directly in DevoxxGenieIDEAPlugin, it works fine
  • Open WebUI v0.6.22 -> when I use the Open WebUI API (which is compatible with the OpenAI API), it isn't working, or my configuration is wrong. Same problem with older Open WebUI versions.

somera avatar Aug 11 '25 09:08 somera

You also need to enable Custom OpenAI HTTP 1.1. After this, it worked on my machine :-)
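For what it's worth, the "Invalid HTTP request received." in the stack trace surfaces from langchain4j's JdkHttpClient, which wraps java.net.http.HttpClient. That client defaults to HTTP/2, and on plain-http URLs it attempts an h2c upgrade that some servers reject; forcing HTTP/1.1 skips that upgrade. A minimal sketch of the difference (my reading of what the plugin toggle does, not its actual code):

```java
import java.net.http.HttpClient;

public class Http11Check {
    public static void main(String[] args) {
        // Default client prefers HTTP/2 (h2c upgrade on plain-http URLs).
        HttpClient defaultClient = HttpClient.newHttpClient();
        // Forcing HTTP/1.1 avoids the upgrade that some servers reject.
        HttpClient http11Client = HttpClient.newBuilder()
                .version(HttpClient.Version.HTTP_1_1)
                .build();
        System.out.println(defaultClient.version()); // HTTP_2
        System.out.println(http11Client.version());  // HTTP_1_1
    }
}
```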

mydeveloperplanet avatar Aug 11 '25 18:08 mydeveloperplanet

@mydeveloperplanet my Open WebUI is running in a VM on Ubuntu 24.04.3.

curl test:

$ curl -H "Authorization: Bearer sk-5a2aee5409074a4898aa37448fb616e9" http://docker-tools:3000/api/models
{"data":[{"id":"gemma3n:e4b","name":"gemma3n:e4b","object":"model","created":1754949560,"owned_by":"ollama","ollama":{"name":"gemma3n:e4b","model":"gemma3n:e4b","modified_at":"2025-08-09T22:30:19.435638575+02:00","size":7547589116,"digest":"15cb39fd9394fd2549f6df9081cfc84dd134ecf2c9c5be911e5629920489ac32","details":{"parent_model":"","format":"gguf","family":"gemma3n","families":["gemma3n"],"parameter_size":"6.9B","quantization_level":"Q4_K_M"} ...

works fine.
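The model names a client should show are the id fields inside that data array. A naive sketch of extracting them from a trimmed fragment of the response above (a real client would use a proper JSON parser):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class ModelIds {
    public static void main(String[] args) {
        // Trimmed fragment of the /api/models response shown above.
        String json = "{\"data\":[{\"id\":\"gemma3n:e4b\",\"name\":\"gemma3n:e4b\",\"object\":\"model\"}]}";
        // Pull out every "id" value with a regex (illustration only).
        List<String> ids = new ArrayList<>();
        Matcher m = Pattern.compile("\"id\":\"([^\"]+)\"").matcher(json);
        while (m.find()) {
            ids.add(m.group(1));
        }
        System.out.println(ids); // [gemma3n:e4b]
    }
}
```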

In IDEA:

[screenshot]

My local setup. No proxy.

But ...

[screenshots]

I don't see any model.

I tested this with IDEA and PyCharm.

It works only if the Custom OpenAI Model is set.

[screenshot]

Which makes it barely usable, even though /api/models works.

somera avatar Aug 11 '25 20:08 somera