bug: Doesn't remember history when using multiple files
Describe the bug
When I work with multiple files, it doesn't remember the conversation history.
To reproduce
Use as usual:
- Run AvanteAsk.
- Add several files.
- Write some instructions.
- Press <C-S>.
- Say "repeat same instructions".
- Press <C-S>. It won't know what you are talking about.
Expected behavior
No response
Installation method
Use lazy.nvim:
{
  "yetone/avante.nvim",
  event = "VeryLazy",
  lazy = false,
  version = false, -- set this if you want to always pull the latest change
  opts = {
    -- add any opts here
  },
  -- if you want to build from source then do `make BUILD_FROM_SOURCE=true`
  build = "make",
  -- build = "powershell -ExecutionPolicy Bypass -File Build.ps1 -BuildFromSource false" -- for windows
  dependencies = {
    "nvim-treesitter/nvim-treesitter",
    "stevearc/dressing.nvim",
    "nvim-lua/plenary.nvim",
    "MunifTanjim/nui.nvim",
  },
}
Environment
nvim v0.10.3 on Windows (shouldn't matter). Latest commit from master.
Repro
vim.env.LAZY_STDPATH = ".repro"
load(vim.fn.system("curl -s https://raw.githubusercontent.com/folke/lazy.nvim/main/bootstrap.lua"))()
require("lazy.minit").repro({
  spec = {
    -- add any other plugins here
  },
})
OK, so it seems that it lacks memory for a single file as well. I suspect it has to do with me using it with our internal AI, though it does seem a bit strange. I will close this if there are no further comments (and it works for other people).
OK, sorry for not sending a PR, but I needed to add the first condition for it to work:
M.parse_response = function(data_stream, _, opts)
  -- Added condition: treat a bare "[DONE]" sentinel as end of stream
  if data_stream:match('%[DONE%]') then
    opts.on_complete(nil)
    return
  end
  if data_stream:match('"%[DONE%]":') then
    opts.on_complete(nil)
    return
  end
  if data_stream:match('"delta":') then
    ---@type OpenAIChatResponse
    local json = vim.json.decode(data_stream)
    if json.choices and json.choices[1] then
      local choice = json.choices[1]
      if choice.finish_reason == "stop" or choice.finish_reason == "eos_token" then
        opts.on_complete(nil)
      elseif choice.delta.content then
        if choice.delta.content ~= vim.NIL then opts.on_chunk(choice.delta.content) end
      end
    end
  end
end
I'm having the same issue with the Claude provider... sometimes it works, but most of the time it has no idea what we just talked about.
You can try increasing max_tokens.
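For anyone trying this, a minimal sketch of where a larger max_tokens could go in the lazy.nvim opts table. Note that the provider table layout and the value shown here are assumptions and may differ between avante.nvim versions, so check your installed version's docs:

```lua
opts = {
  provider = "openai", -- assumption: or "claude", whichever provider you use
  openai = {
    max_tokens = 8192, -- hypothetical value; raise the completion budget so replies aren't truncated
  },
},
```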
Having this very same issue too
I fixed the memory issue in the latest version, please update and try again.