The model unexpectedly did not return a response, which may indicate a service issue. Please report a bug.
Type: Bug
Extension version: 0.26.7
VS Code version: Code 1.99.3 (17baf841131aa23349f217ca7c570c76ee87b957, 2025-04-15T23:18:46.076Z)
OS version: Windows_NT x64 10.0.22631
Modes:
System Info
| Item | Value |
|---|---|
| CPUs | 12th Gen Intel(R) Core(TM) i5-12600K (16 x 3686) |
| GPU Status | 2d_canvas: enabled canvas_oop_rasterization: enabled_on direct_rendering_display_compositor: disabled_off_ok gpu_compositing: enabled multiple_raster_threads: enabled_on opengl: enabled_on rasterization: enabled raw_draw: disabled_off_ok skia_graphite: disabled_off video_decode: enabled video_encode: enabled vulkan: disabled_off webgl: enabled webgl2: enabled webgpu: enabled webnn: disabled_off |
| Load (avg) | undefined |
| Memory (System) | 31.78GB (11.98GB free) |
| Process Argv | --crash-reporter-id d8cff913-db2e-4c10-8b9c-64b393d59643 |
| Screen Reader | no |
| VM | 0% |
A/B Experiments
vsliv368:30146709
vspor879:30202332
vspor708:30202333
vspor363:30204092
vscod805cf:30301675
binariesv615:30325510
c4g48928:30535728
azure-dev_surveyone:30548225
2i9eh265:30646982
962ge761:30959799
h48ei257:31000450
pythontbext0:30879054
cppperfnew:31000557
dwnewjupyter:31046869
pythonrstrctxt:31112756
nativeloc2:31192216
5fd0e150:31155592
dwcopilot:31170013
6074i472:31201624
dwoutputs:31242946
customenabled:31248079
hdaa2157:31222309
copilot_t_ci:31222730
e5gg6876:31282496
pythoneinst12:31285622
bgtreat:31268568
4gafe986:31271826
31787653:31262186
3e8i5726:31271747
996jf627:31283433
useunpkgapi:31292914
7bj51361:31289155
747dc170:31275177
aj496949:31278748
aj953862:31281341
generatesymbolt:31295002
convertfstringf:31295003
gendocf:31295004
Would it be possible to share some repro steps with us, such as a file similar to the one you're using, the prompt you're using, and the model you picked? Thank you!
Sometimes when using Gemini 2.5 Pro, it stops and shows "The model unexpectedly did not return a response, which may indicate a service issue. Please report a bug."
Getting "The model unexpectedly did not return a response, which may indicate a service issue. Please report a bug." with Sonnet 4 as well.
Same problem on a Linux machine.
"The model unexpectedly did not return a response, which may indicate a service issue. Please report a bug."
```
2025-06-24 20:56:36.582 [info] [streamChoices] solution 0 returned. finish reason: [stop]
2025-06-24 20:56:36.699 [info] [fetchCompletions] Request 6938e6a5-ee96-4197-836d-41224df0af7c at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 383.25045699998736ms
2025-06-24 20:56:36.700 [info] [streamChoices] solution 0 returned. finish reason: [stop]
2025-06-24 20:56:37.133 [info] [fetchCompletions] Request 34c95e80-7b75-48c8-a4df-d7f757f53496 at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 430.7805890000891ms
2025-06-24 20:56:37.134 [info] [streamChoices] solution 0 returned. finish reason: [null]
2025-06-24 20:59:14.412 [info] [fetchCompletions] Request 6a012791-ab27-41c3-96ce-b520fb612a96 at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 955.5829320000485ms
2025-06-24 20:59:14.942 [info] [fetchCompletions] Request 191523fb-7acc-4f33-a6d1-c5ee47b579d6 at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 526.7545019998215ms
2025-06-24 20:59:14.943 [info] [streamChoices] solution 0 returned. finish reason: [stop]
2025-06-24 20:59:15.543 [info] [fetchCompletions] Request dd48d4c8-a3aa-4b44-9784-3bf54ccf9da6 at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 590.5276869998779ms
2025-06-24 20:59:15.546 [info] [streamChoices] solution 0 returned. finish reason: [null]
2025-06-24 20:59:26.065 [info] [fetchCompletions] Request dd84da7b-2995-4c81-91fc-aea04f427ff3 at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 316.92828999995254ms
2025-06-24 20:59:26.066 [info] [streamChoices] solution 0 returned. finish reason: [stop]
2025-06-24 20:59:26.516 [info] [fetchCompletions] Request d6de3bc8-5e49-4990-bce0-5421019bf1cb at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 444.430223000003ms
2025-06-24 20:59:26.517 [info] [streamChoices] solution 0 returned. finish reason: [null]
2025-06-24 20:59:26.518 [info] [ghostText] Filtered out solution matching next line
2025-06-24 21:08:25.719 [info] [fetcher] Using Helix fetcher.
2025-06-24 21:08:25.721 [info] [code-referencing] Public code references are enabled.
2025-06-24 21:08:25.723 [info] [fetcher] Using Helix fetcher.
2025-06-24 21:11:29.477 [info] [fetchCompletions] Request 82a251fa-41f7-4502-8452-e497002da9dc at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 956.5244830001611ms
2025-06-24 21:11:29.478 [info] [streamChoices] solution 0 returned. finish reason: [stop]
2025-06-24 21:11:30.066 [info] [fetchCompletions] Request 13b17b54-1f42-40d4-88fb-fc03c83721fd at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 538.1364100000355ms
2025-06-24 21:11:30.066 [info] [streamChoices] solution 0 returned. finish reason: [stop]
2025-06-24 21:11:30.532 [info] [fetchCompletions] Request 8faed9c7-7df1-40e3-b7e1-f6e0663a1304 at <https://proxy.individual.githubcopilot.com/v1/engines/gpt-4o-copilot/completions> finished with 200 status after 460.61223000008613ms
2025-06-24 21:11:30.534 [info] [streamChoices] solution 0 returned. finish reason: [null]
```
The models have improved a lot, and we've also made many good changes. On top of that, our code is now open source, so we can easily share traces. Please open a new issue at https://github.com/microsoft/vscode if you still encounter this problem.