
openai.lua:83: Expected comma or object end but found T_END

Open Kamilcuk opened this issue 2 years ago • 10 comments

Error executing vim.schedule lua callback: /home/kamil/.vim/plugged/ai.vim/lua/_ai/openai.lua:83: Expected comma or object end but found T_END at character 205
stack traceback:
        [C]: in function 'decode'
        /home/kamil/.vim/plugged/ai.vim/lua/_ai/openai.lua:83: in function 'on_stdout_chunk'
        /home/kamil/.vim/plugged/ai.vim/lua/_ai/openai.lua:14: in function </home/kamil/.vim/plugged/ai.vim/lua/_ai/openai.lua:13>

Kamilcuk avatar Mar 25 '23 22:03 Kamilcuk

This happens when the API returns an error. In that case, the output does not contain `data:`-prefixed streaming lines, just the error JSON.

Kamilcuk avatar Mar 25 '23 23:03 Kamilcuk

This happens to me when I specify vim.g.ai_completions_model = "code-davinci-002" or vim.g.ai_completions_model = "gpt-3.5-turbo" and run the normal Ctrl-A action in normal mode.

aaronik avatar Mar 30 '23 03:03 aaronik

I got this error message when my free tier ran out. When I set up billing, the plugin started working again. I have not changed any settings. It would be nice if the plugin hinted at what the issue could be.

nicolaiskogheim avatar Apr 03 '23 11:04 nicolaiskogheim

The message appears because the API's error response is plain JSON, while a normal streaming response prefixes each line with `data:`. The reading routine has to be fixed to properly handle the error case.
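A minimal sketch of such a fix (hypothetical, not the plugin's actual code; the `on_data`/`on_complete` callback names are assumptions) could branch on the `data:` prefix:

```lua
-- Hypothetical chunk handler sketch for ai.vim's stream reader.
-- Assumes Neovim's vim.json; untested against the real plugin.
local function handle_line (line)
    if line == "" or line == "data: [DONE]" then
        return
    end
    local payload = line:match("^data: (.*)$")
    if payload then
        -- Normal streaming line: "data: {...}"
        on_data(vim.json.decode(payload))
    else
        -- No "data:" prefix: likely a plain error object from the API
        local ok, json = pcall(vim.json.decode, line)
        if ok and json.error then
            on_complete(json.error.message)
        else
            on_complete("Unexpected response from API: " .. line)
        end
    end
end
```

Using `pcall` around the decode also prevents the `Expected comma or object end but found T_END` error from escaping into `vim.schedule` when a chunk is not valid JSON.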

Kamilcuk avatar Apr 03 '23 11:04 Kamilcuk

I still have the same problem, although I just subscribed to GPT-plus:

Error executing vim.schedule lua callback: /Users/tothlac/.vim/bundle/ai.vim/lua/_ai/openai.lua:83: Expected comma or object end but found T_END at character 205
stack traceback:
        [C]: in function 'decode'
        /Users/tothlac/.vim/bundle/ai.vim/lua/_ai/openai.lua:83: in function 'on_stdout_chunk'
        /Users/tothlac/.vim/bundle/ai.vim/lua/_ai/openai.lua:14: in function </Users/tothlac/.vim/bundle/ai.vim/lua/_ai/openai.lua:13>

Any ideas?

tothlac avatar Apr 11 '23 15:04 tothlac

> I just subscribed

Wait 5 min and try again.

Kamilcuk avatar Apr 11 '23 15:04 Kamilcuk

It's still not working. Should I generate a new API key after subscribing to gpt plus, or do I need to do anything else to make it work again?

tothlac avatar Apr 11 '23 15:04 tothlac

@tothlac @Kamilcuk

I made some observations and did some debugging in the code, and solved the problem as follows.

After some analysis, debugging, and code review, I made a few notes that should serve as future fixes. (As a helper, I used the tool itself against itself to assist with the review.) All the corresponding screenshots and videos are at the end of this comment.

  1. Your environment variable OPENAI_API_KEY must be properly set; make sure no other assignment overwrites it in your .zshrc or .bashrc files. For example, in your .zshrc or .bashrc:

export OPENAI_API_KEY="sk-14MH4C53R4NDR350LV3DTH3PR063M" (obviously this key does not exist).

  2. There is a memory leak: the on_stdout_chunk function accumulates chunks in the buffered_chunks variable, but it is never emptied. To address this, you can add a size limit to the variable and empty it when the limit is reached or at the end of the process. (I'd say this problem is of medium priority, based on the response generated by OpenAI.)

I made this code to debug what was happening; the result is in the print at the end of this comment:

        vim.api.nvim_err_writeln(json_str)
        local json = vim.json.decode(json_str)

        if json.error then
            on_complete(json.error.message)
            buffered_chunks = ""
        else
            on_data(json)
        end
 
  3. In the exec function, there is a variable named error assigned from the vim.loop.spawn call. However, error is the name of a built-in Lua function, and shadowing it should be avoided. To fix this, we can rename the variable to something like spawn_error. I made these changes below:
        local handle
        local spawn_error

        handle, spawn_error = vim.loop.spawn(cmd, {
            args = args,
            stdio = {nil, stdout, stderr},
        }, function (code)
            stdout:close()
            stderr:close()
            handle:close()

            vim.schedule(function ()
                if code ~= 0 then
                    on_complete(vim.trim(table.concat(stderr_chunks, "")))
                else
                    on_complete()
                end
            end)
        end)

        if not handle then
            on_complete(cmd .. " could not be started: " .. spawn_error)
        else

  4. The M.completions function (line 105: openai.lua) sets a default value for the stream key of the body table. However, if the stream value has already been set in the body table, the default value is ignored. It is best to use vim.tbl_deep_extend to ensure that the table is extended correctly.

  5. The vim.tbl_extend function (line 336: shared.lua; lines 106 and 119: openai.lua) is used to extend a table with default values. However, it modifies the original table. If the original table needs to be kept intact, it is better to create a new table and copy the values from the original into it using vim.tbl_deep_extend.
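Point 4 above could be sketched like this (hypothetical snippet; the variable names are assumptions, the semantics follow Neovim's vim.tbl_deep_extend documentation):

```lua
-- Hypothetical sketch: apply a default "stream" value without mutating
-- the caller's body table. With "force", values from the right-most
-- table win on conflict, so a stream value the caller already set
-- overrides the default.
local defaults = { stream = true }
local merged = vim.tbl_deep_extend("force", defaults, body or {})
```

This keeps both `defaults` and the caller's `body` intact, since the merged result is a new table.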

https://user-images.githubusercontent.com/98850074/231534748-e6858eb0-a32a-4552-96ce-37385cc8cf9d.mp4

-- Errors raised [Screenshot: Captura de tela de 2023-04-12 12-03-34]

[Screenshot: Captura de tela de 2023-04-12 13-22-29]

I will not make a pull request, because correctly exporting the environment variable is a per-user matter, while the other errors were only debugged by me on my own machine without any kind of tests beforehand. I will leave this issue for the plugin's creator to take a look at the errors and solutions I mentioned above, so these mistakes don't happen again. Sorry for the bad English, haha.

washonrails avatar Apr 12 '23 17:04 washonrails

I'm getting this error only from visual mode. Changing ai_edits_model has not made a difference so far.

fneu avatar Aug 05 '23 20:08 fneu

It looks like the edits API has been deprecated by OpenAI: https://openai.com/blog/gpt-4-api-general-availability

We'll need to look into porting parts of the plugin to the Chat Completions API.
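As a rough sketch of that port (hypothetical request shape, following OpenAI's public Chat Completions documentation; the `instruction`/`input` variables stand in for whatever the plugin currently sends to the edits endpoint):

```lua
-- Hypothetical sketch: mapping an edits-style (input, instruction) pair
-- onto a Chat Completions request body. Field names follow OpenAI's
-- public API docs; untested against the plugin.
local body = {
    model = "gpt-3.5-turbo",
    stream = true,
    messages = {
        { role = "system", content = instruction },
        { role = "user", content = input },
    },
}
-- POST this body as JSON to https://api.openai.com/v1/chat/completions
```

The streaming response format (`data:`-prefixed lines ending with `data: [DONE]`) is the same shape as the completions stream, so the existing reader could likely be reused with a different field path for the generated text.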

aduros avatar Aug 06 '23 22:08 aduros