openai-kotlin
Compatibility Issues with SpeakGPT and Groq llama3 Endpoints
Description
There are compatibility issues when using the openai-kotlin library with the SpeakGPT app, specifically with the Groq endpoint. The app functions correctly, but it consistently triggers error messages that appear to be related to how exceptions are handled within the library.
See original bug report here for more context.
Steps to Reproduce
- Integrate the openai-kotlin library with the SpeakGPT app.
- Configure the app to interact with the Groq endpoint.
- Observe that the app correctly processes responses but incorrectly triggers error messages.
Expected Behavior
The library should handle endpoint interactions without triggering unnecessary error messages, ensuring smooth operation across different configurations.
Actual Behavior
Error messages are triggered after every API response, despite the responses being correct and fully processed. This suggests an issue with the exception handling mechanism in the library when used with endpoints other than the default.
(Screenshots attached in the original report: the issue, the same chat's next message, the API URL, the model, and the event log, though I'm not sure the event log is relevant.)
It looks like the speak-gpt app appends this error text itself, so it is not an issue with the configuration or the API; it appears to be an issue with the error-handling logic inside the speak-gpt app, which fires when it shouldn't.
Additional Information
- SpeakGPT app version: 3.22
- Library version: unknown to me
Possible Solution
It would be beneficial to review the exception handling logic within the library to ensure it is robust across various endpoints. Alternatively, providing more detailed documentation or configuration options to handle such cases might help.
Links
- Related issue in SpeakGPT: https://github.com/AndraxDev/speak-gpt/issues/100
Thank you for looking into this matter. Your assistance will help improve the usability of the library in diverse applications.
The library throws exceptions when known issues are returned by the OpenAI APIs. When an unexpected error occurs, a generic exception is thrown. This may vary on a case-by-case basis. Therefore, to resolve any issues of this nature, it would be best to have information about which endpoint is failing and why.
Thank you for your prompt response, @aallam.
Here are the specifics related to the endpoint and model that are causing the issue:
- API URL: https://api.groq.com/openai/v1/
- Model used: llama3-70b-8192
- API key: API keys can be obtained for free at the Groq API Console.
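For reference, the client setup looks roughly like the following. This is a minimal sketch: the `OpenAIConfig`/`OpenAIHost` parameter names are written from memory and may differ between library versions, and the token is a placeholder.

```kotlin
import com.aallam.openai.client.OpenAI
import com.aallam.openai.client.OpenAIConfig
import com.aallam.openai.client.OpenAIHost

// Point the client at Groq's OpenAI-compatible base URL
// instead of the default api.openai.com host.
val config = OpenAIConfig(
    token = "gsk_your_groq_key", // placeholder; obtain a real key from the Groq console
    host = OpenAIHost(baseUrl = "https://api.groq.com/openai/v1/"),
)
val openAI = OpenAI(config)
```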
The error messages are triggered after every API response: the responses themselves are correct and fully processed, yet the app still shows an error. This issue seems to be related to how exceptions are handled when this particular endpoint and model are used with the openai-kotlin library.
Could we possibly look into the exception handling logic for responses from this specific API? It might help to understand why the library perceives these correct responses as errors and throws exceptions.
Thank you for assisting in improving the library's compatibility with various endpoints.
This may be related, I'm getting this exception while trying to use the completion endpoint with Groq:
Exception in thread "main" io.ktor.serialization.JsonConvertException: Illegal input: Field 'param' is required for type with serial name 'com.aallam.openai.api.exception.OpenAIErrorDetails', but it was missing at path: $.error
at io.ktor.serialization.kotlinx.KotlinxSerializationConverter.deserialize(KotlinxSerializationConverter.kt:90)
at io.ktor.serialization.ContentConverterKt$deserialize$$inlined$map$1$2.emit(Emitters.kt:224)
at kotlinx.coroutines.flow.FlowKt__BuildersKt$asFlow$$inlined$unsafeFlow$3.collect(SafeCollector.common.kt:116)
at io.ktor.serialization.ContentConverterKt$deserialize$$inlined$map$1.collect(SafeCollector.common.kt:113)
at kotlinx.coroutines.flow.FlowKt__ReduceKt.firstOrNull(Reduce.kt:243)
at kotlinx.coroutines.flow.FlowKt.firstOrNull(Unknown Source)
at io.ktor.serialization.ContentConverterKt.deserialize(ContentConverter.kt:123)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiation.convertResponse$ktor_client_content_negotiation(ContentNegotiation.kt:230)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiation$Plugin$install$2.invokeSuspend(ContentNegotiation.kt:262)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiation$Plugin$install$2.invoke(ContentNegotiation.kt)
at io.ktor.client.plugins.contentnegotiation.ContentNegotiation$Plugin$install$2.invoke(ContentNegotiation.kt)
at io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:120)
at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SuspendFunctionGun.kt:78)
at io.ktor.client.HttpClient$4.invokeSuspend(HttpClient.kt:177)
at io.ktor.client.HttpClient$4.invoke(HttpClient.kt)
at io.ktor.client.HttpClient$4.invoke(HttpClient.kt)
at io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:120)
at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SuspendFunctionGun.kt:78)
at io.ktor.util.pipeline.SuspendFunctionGun.proceedWith(SuspendFunctionGun.kt:88)
at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invokeSuspend(HttpCallValidator.kt:142)
at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invoke(HttpCallValidator.kt)
at io.ktor.client.plugins.HttpCallValidator$Companion$install$2.invoke(HttpCallValidator.kt)
at io.ktor.util.pipeline.SuspendFunctionGun.loop(SuspendFunctionGun.kt:120)
at io.ktor.util.pipeline.SuspendFunctionGun.proceed(SuspendFunctionGun.kt:78)
at io.ktor.util.pipeline.SuspendFunctionGun.execute$ktor_utils(SuspendFunctionGun.kt:98)
at io.ktor.util.pipeline.Pipeline.execute(Pipeline.kt:77)
at io.ktor.client.call.HttpClientCall.bodyNullable(HttpClientCall.kt:89)
at com.aallam.openai.client.internal.http.HttpTransport.openAIAPIException(HttpTransport.kt:73)
at com.aallam.openai.client.internal.http.HttpTransport.handleException(HttpTransport.kt:48)
at com.aallam.openai.client.internal.http.HttpTransport.perform(HttpTransport.kt:23)
at com.aallam.openai.client.internal.http.HttpTransport$perform$1.invokeSuspend(HttpTransport.kt)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
at kotlinx.coroutines.internal.LimitedDispatcher$Worker.run(LimitedDispatcher.kt:115)
at kotlinx.coroutines.scheduling.TaskImpl.run(Tasks.kt:103)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:584)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:793)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:697)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:684)
Caused by: kotlinx.serialization.MissingFieldException: Field 'param' is required for type with serial name 'com.aallam.openai.api.exception.OpenAIErrorDetails', but it was missing at path: $.error
at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:95)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableValue(AbstractDecoder.kt:43)
at kotlinx.serialization.encoding.AbstractDecoder.decodeNullableSerializableElement(AbstractDecoder.kt:78)
at com.aallam.openai.api.exception.OpenAIError$$serializer.deserialize(OpenAIErrorDetails.kt:11)
at com.aallam.openai.api.exception.OpenAIError$$serializer.deserialize(OpenAIErrorDetails.kt:11)
at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:69)
at kotlinx.serialization.json.Json.decodeFromString(Json.kt:107)
at io.ktor.serialization.kotlinx.KotlinxSerializationConverter.deserialize(KotlinxSerializationConverter.kt:82)
... 38 more
Caused by: kotlinx.serialization.MissingFieldException: Field 'param' is required for type with serial name 'com.aallam.openai.api.exception.OpenAIErrorDetails', but it was missing
at kotlinx.serialization.internal.PluginExceptionsKt.throwMissingFieldException(PluginExceptions.kt:20)
at com.aallam.openai.api.exception.OpenAIErrorDetails.<init>(OpenAIErrorDetails.kt:24)
at com.aallam.openai.api.exception.OpenAIErrorDetails$$serializer.deserialize(OpenAIErrorDetails.kt:24)
at com.aallam.openai.api.exception.OpenAIErrorDetails$$serializer.deserialize(OpenAIErrorDetails.kt:24)
at kotlinx.serialization.json.internal.StreamingJsonDecoder.decodeSerializableValue(StreamingJsonDecoder.kt:69)
... 45 more
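The trace above fails while deserializing the error body: kotlinx.serialization requires the `param` field of `OpenAIErrorDetails`, but Groq's error payload apparently omits it. As a hypothetical illustration (these payloads are assumptions inferred from the exception, not captured responses):

```
OpenAI-style error body ("param" present, even when null):
{ "error": { "message": "...", "type": "invalid_request_error", "param": null, "code": null } }

Groq-style error body as implied by the MissingFieldException ("param" absent):
{ "error": { "message": "...", "type": "invalid_request_error", "code": "..." } }
```

If that reading is right, a plausible fix would be to declare `param` (and any similarly optional fields) as nullable with a default value in the serializable error classes, so that error bodies from OpenAI-compatible providers that omit the field still deserialize cleanly.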