
IndexErrors when streaming API returns empty choices

jvnn opened this issue 2 years ago (status: open)

I'm using vim-ai together with the Azure OpenAI service, which is almost compatible with the official OpenAI API but does have its quirks. I'm using a self-made proxy service in between, which is supposed to handle the differences and allow applications that aren't "Azure aware" to use pure OpenAI paths and mechanisms transparently with the Azure models.

Since updating the proxy to use the latest Azure API version, vim-ai stopped working correctly. It throws an IndexError whenever a response from the API contains an empty 'choices' array, because vim-ai always assumes there is at least one item in it. Here's an example response (via debug logging):

[2023-12-29 09:12:56.427605] [chat] response: {'id': '', 'object': '', 'created': 0, 'model': '', 'prompt_filter_results': [{'prompt_index': 0, 'content_filter_results': {'hate': {'filtered': False, 'severity': 'safe'}, 'self_harm': {'filtered': False, 'severity': 'safe'}, 'sexual': {'filtered': False, 'severity': 'safe'}, 'violence': {'filtered': False, 'severity': 'safe'}}}], 'choices': []}
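For illustration, here is a minimal standalone reproduction of the failure mode, using a simplified version of the chunk above and the same unconditional `choices[0]` access that vim-ai's chunk mapper performs:

```python
# Streamed chunk as returned by the Azure API (simplified): 'choices' is empty.
resp = {'id': '', 'object': '', 'created': 0, 'model': '', 'choices': []}

# vim-ai's mapper indexes choices[0] unconditionally, so an empty
# list raises IndexError instead of yielding an empty text chunk.
try:
    content = resp['choices'][0]['delta'].get('content', '')
except IndexError:
    content = None

assert content is None
```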

I managed to get it working correctly again with the simple patch below. Sorry for not making a proper PR for this; I really don't have the time right now to think the change through or test it properly, so I decided to file a quick issue instead.

$ git diff
diff --git a/py/chat.py b/py/chat.py
index ff70904..a62d9fc 100644
--- a/py/chat.py
+++ b/py/chat.py
@@ -76,7 +76,10 @@ try:
         response = openai_request(url, request, http_options)
         def map_chunk(resp):
             printDebug("[chat] response: {}", resp)
-            return resp['choices'][0]['delta'].get('content', '')
+            if resp['choices']:
+                return resp['choices'][0]['delta'].get('content', '')
+            else:
+                return ""
         text_chunks = map(map_chunk, response)
         render_text_chunks(text_chunks, is_selection)
 
diff --git a/py/complete.py b/py/complete.py
index debe275..a598394 100644
--- a/py/complete.py
+++ b/py/complete.py
@@ -24,7 +24,10 @@ def complete_engine(prompt):
     response = openai_request(url, request, http_options)
     def map_chunk(resp):
         printDebug("[engine-complete] response: {}", resp)
-        return resp['choices'][0].get('text', '')
+        if resp['choices']:
+            return resp['choices'][0].get('text', '')
+        else:
+            return ""
     text_chunks = map(map_chunk, response)
     return text_chunks
 
@@ -43,7 +46,10 @@ def chat_engine(prompt):
     response = openai_request(url, request, http_options)
     def map_chunk(resp):
         printDebug("[engine-chat] response: {}", resp)
-        return resp['choices'][0]['delta'].get('content', '')
+        if resp['choices']:
+            return resp['choices'][0]['delta'].get('content', '')
+        else:
+            return ""
     text_chunks = map(map_chunk, response)
     return text_chunks

jvnn commented on Dec 29, 2023