vscode-copilot-release
chat history bugs out
Type: Bug
When I scroll up looking at older responses, the entire chat bugs out: in some places the code snippets it supplies lose functionality, while at other times it takes a code snippet from a previous response and places it in the latest one.
And one last observation: when asking the AI something, the response window flickers between the old response and the new one with every new word supplied.
Extension version: 0.14.1
VS Code version: Code 1.88.1 (e170252f762678dec6ca2cc69aba1570769a5d39, 2024-04-10T17:41:02.734Z)
OS version: Windows_NT x64 10.0.19045
Modes:
System Info
| Item | Value |
| --- | --- |
| CPUs | Intel(R) Core(TM) i7-6700K CPU @ 4.00GHz (8 x 4008) |
| GPU Status | 2d_canvas: enabled; canvas_oop_rasterization: enabled_on; direct_rendering_display_compositor: disabled_off_ok; gpu_compositing: enabled; multiple_raster_threads: enabled_on; opengl: enabled_on; rasterization: enabled; raw_draw: disabled_off_ok; skia_graphite: disabled_off; video_decode: enabled; video_encode: enabled; vulkan: disabled_off; webgl: enabled; webgl2: enabled; webgpu: enabled |
| Load (avg) | undefined |
| Memory (System) | 47.94GB (29.52GB free) |
| Process Argv | --crash-reporter-id da43d6eb-32ed-4391-8f22-53247feb5d29 |
| Screen Reader | no |
| VM | 0% |
A/B Experiments
vsliv368:30146709
vspor879:30202332
vspor708:30202333
vspor363:30204092
vscorecescf:30445987
vscod805:30301674
binariesv615:30325510
vsaa593cf:30376535
py29gd2263:31024239
c4g48928:30535728
azure-dev_surveyone:30548225
2i9eh265:30646982
962ge761:30959799
pythongtdpath:30769146
welcomedialog:30910333
pythonidxpt:30866567
pythonnoceb:30805159
asynctok:30898717
pythontestfixt:30902429
pythonregdiag2:30936856
pyreplss1:30897532
pythonmypyd1:30879173
pythoncet0:30885854
h48ei257:31000450
pythontbext0:30879054
accentitlementst:30995554
dsvsc016:30899300
dsvsc017:30899301
dsvsc018:30899302
cppperfnew:31000557
8082a590:30971561
fegfb526:30981948
bg6jg535:30979843
ccp1r3:30993539
dsvsc020:30976470
pythonait:31006305
gee8j676:31009558
chatpanelc:31018788
dsvsc021:30996838
jg8ic977:31013176
pythoncenvpt:31022790
I want to add that this only occurs when the chat history unloads some of the history and then retrieves it as you scroll up.
So, I just found out that the flicker is caused by having the search function open; it is not related to the history unloading and reloading when scrolling up.
This is very troubling, especially since you take time to teach it first and then it loses everything.
Does this reproduce in the latest VS Code Insiders build?
If it does, please share some specific steps to reproduce the issue
Yes, it still happens.
It's as if it tries to load the history back in, but somewhere in the embedding it doesn't load correctly. The history does remember the entire size of the chat, but some of the embeds get resized/compacted without a scrollbar instead of keeping their full original size. This already happens after about 4 to 6 replies.
Here is a full code snippet as an example of the original embed size:
In the meantime I have found out that opening the Copilot chat in a new window, or having editors in new windows, removes its ability to "see" or register those windows, even when you select a code snippet and ask about it.
Another observation: resizing the chat window itself while looking at an embedded code snippet that has been compressed fixes the embed window and shows the entire code again.
Before resize:
After resizing sideways:
From fisforfaheem:
> This is very troubling, especially since you take time to teach it first and then it loses everything.
In regards to this issue, I've done some testing myself.
I understand it to be that the request being sent gets too large, like a dictionary that grows too big: it resends all the data every time and eventually hits the token size limit, thus forgetting parts of the conversation as well.
This could be fixed by checking against an approximate token size limit and telling the user they've reached it, then letting them create a new chat or select parts of the chat history to delete, so the chat doesn't lose the entire context. A rough sketch of such a check is shown below.
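A minimal sketch of what such a pre-flight check could look like, assuming a simple characters-per-token heuristic; the interface, function names, and the 8192-token figure are illustrative assumptions, not the actual Copilot implementation:

```typescript
// Hypothetical sketch only: estimate the token footprint of the accumulated
// chat history and warn before an assumed context-window limit is exceeded,
// instead of silently dropping older turns.

interface ChatTurn {
  role: 'user' | 'assistant';
  content: string;
}

// Rough heuristic: about 4 characters per token for English text and code.
// A real tokenizer would give exact counts.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Assumed limit for illustration; the real limit depends on the model in use.
const ASSUMED_CONTEXT_LIMIT = 8192;

function checkHistoryBudget(history: ChatTurn[]): { used: number; nearLimit: boolean } {
  const used = history.reduce((sum, turn) => sum + estimateTokens(turn.content), 0);
  return { used, nearLimit: used > ASSUMED_CONTEXT_LIMIT * 0.9 };
}

// Example: warn the user so they can start a new chat or prune older turns
// themselves, rather than having context disappear without notice.
const history: ChatTurn[] = [
  { role: 'user', content: 'Here is my 250-line file, please refactor it...' },
  { role: 'assistant', content: 'Sure, here is a refactored version...' },
];
const { used, nearLimit } = checkHistoryBudget(history);
if (nearLimit) {
  console.warn(
    `Chat history is about ${used} tokens, close to the assumed ` +
    `${ASSUMED_CONTEXT_LIMIT}-token limit. Consider starting a new chat ` +
    'or deleting older parts of the history.'
  );
}
```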
Yes, and the text window should also be doubled given the nature of code, because it won't simply understand it otherwise.
Hey @mjbvz, this issue might need further attention.
@Axiinos, you can help us out by closing this issue if the problem no longer exists, or adding more information.
I believe we've fixed this. Let me know if you're still seeing this in the latest VS Code builds.
Fixed now, but the context understanding issues still persist; it simply ignores code that is 250+ lines long.