@kalanchan this means we will need to send a request to the backend to remove their access token, is that correct? What if the user is using the token somewhere else...
@dominiccooney I've applied your feedback to the PR:
- [x] Use a TextDecoderStream
- [x] Accumulate the decoded text into a string
- [x] Break on newlines
- [x] JSON.parse...
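For context, the newline-delimited JSON handling in that checklist can be sketched roughly as below. This is an illustrative sketch, not the PR's actual code: the function name `parseNdjsonChunks` is hypothetical, and the real implementation feeds decoded chunks from a `TextDecoderStream` into this kind of buffer-and-split loop.

```javascript
// Hypothetical sketch of the NDJSON handling described in the checklist:
// accumulate decoded text in a buffer, break on newlines, and JSON.parse
// each complete line, keeping any trailing partial line for the next chunk.
function parseNdjsonChunks(chunks) {
    const messages = []
    let buffer = ''
    for (const chunk of chunks) {
        buffer += chunk
        const lines = buffer.split('\n')
        // The last element may be an incomplete line; carry it over.
        buffer = lines.pop()
        for (const line of lines) {
            if (line.trim().length > 0) {
                messages.push(JSON.parse(line))
            }
        }
    }
    return messages
}
```

In the streaming case, each `chunk` would come from iterating the response body piped through a `TextDecoderStream`, so a JSON object split across two network chunks is still parsed exactly once.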
@dominiccooney @valerybugakov I've updated the PR to use the official Ollama js package for chat instead of implementing our own client. May I get your review again please?
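For reviewers, the shape of the change is roughly the following. This is a hedged sketch, not the PR's code: the model name, prompt, and the helper `accumulateChatParts` are placeholders; the `ollama.chat({ ..., stream: true })` call is the official package's streaming chat API.

```javascript
// Concatenate the content of streamed chat parts into one string.
// (Helper name is illustrative, not from the PR.)
function accumulateChatParts(parts) {
    let text = ''
    for (const part of parts) {
        text += part.message.content
    }
    return text
}

// Usage against a local Ollama server (requires `npm install ollama`):
//
//   import ollama from 'ollama'
//   const stream = await ollama.chat({
//       model: 'llama3',                                  // placeholder model
//       messages: [{ role: 'user', content: 'Hello' }],   // placeholder prompt
//       stream: true,
//   })
//   const parts = []
//   for await (const part of stream) parts.push(part)
//   console.log(accumulateChatParts(parts))
```

The point of the change is that connection handling, request formatting, and stream parsing move into the official package, so the extension only consumes the typed parts.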
Verified Ollama still works with the latest commit: 
This is as intended, as those models are currently not supported in Edits. Should we include them but mark them as unavailable, @umpox @toolmantim?
@umpox we have a [newTestSuitePrompt](https://sourcegraph.com/github.com/sourcegraph/cody/-/blob/vscode/src/commands/execute/test-edit.ts?L103) that specifically asks the LLM not to include any imports if there is an existing test file, since the new test suite is appended to...
@AizenvoltPrime This sounds like the Pre-Instruction setting we have in Cody for Chat and the Edit command! Can you give this a try and let us know if this is...
> I want it just for the first chat. Because the pre-prompt is rather big in size, I don't want to waste context tokens for it on every chat message....
Reassigning to @philipp-spiess since he has a PR ready.
Moving the storage layer change on the Cline side to https://github.com/cline/cline/pull/7795