
Codestral (Mistral code suggestion)

Open Solido opened this issue 1 year ago • 18 comments

Check for existing issues

  • [X] Completed

Describe the feature

Support Codestral from Mistral AI as an equivalent of OpenAI.

Codestral supports infill, and VS Code plugins are already available.

https://mistral.ai/news/codestral/

Thanks!

If applicable, add mockups / screenshots to help present your vision of the feature

No response

Solido avatar May 31 '24 06:05 Solido

Additionally, it'd be amazing if we could use this for inline_completions.

universalmind303 avatar Jul 03 '24 18:07 universalmind303

Don't be too excited. Codestral is terrible at doing FIM. I have switched to asking Sonnet 3.5 to just fill in the marked part, and it does the job 10x better, even though it is a chat model and not tuned for FIM at all. Codestral can't even match the parentheses right.

NightMachinery avatar Jul 03 '24 20:07 NightMachinery

I was able to use the Codestral model with private-gpt (a fork of zylon-ai's private-gpt) in chat mode, running in Docker with NVIDIA GPU support. So it would be cool if we could get it to work with Zed locally.

neofob avatar Jul 10 '24 23:07 neofob

I did a basic implementation that works: https://github.com/zed-industries/zed/pull/15573

A few outstanding questions remain, as I don't know this code base very well.

seddonm1 avatar Jul 31 '24 21:07 seddonm1

FTR, my settings for codestral:

{
  "language_models": {
    "openai": {
      "version": "1",
      "api_url": "https://codestral.mistral.ai/v1",
      "available_models": [
        { "custom": { "name": "codestral-latest", "max_tokens": 131072 } }
      ]
    }
  },
  "assistant": {
    "version": "2",
    "default_model": {
      "provider": "openai",
      "model": "codestral-latest"
    }
  },
...

bersace avatar Aug 12 '24 15:08 bersace
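The settings above route Zed's OpenAI-compatible provider to Codestral's dedicated endpoint. A minimal sketch of the equivalent request built by hand (a sketch only: the `chat_request` helper is hypothetical, the model name and paths follow the settings in the comment above, and the API key is assumed to live in a `CODESTRAL_API_KEY` environment variable):

```python
import os

# Codestral uses its own endpoint; regular Mistral models
# are served from https://api.mistral.ai/v1 instead.
API_URL = "https://codestral.mistral.ai/v1"

def chat_request(messages: list, model: str = "codestral-latest"):
    """Return (url, headers, body) for an OpenAI-style chat completion call."""
    url = f"{API_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ.get('CODESTRAL_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    body = {"model": model, "messages": messages, "max_tokens": 256}
    return url, headers, body

url, headers, body = chat_request(
    [{"role": "user", "content": "Write hello world in Rust"}]
)
print(url)  # https://codestral.mistral.ai/v1/chat/completions
```

Sending `body` as JSON to `url` with those headers (e.g. via `urllib.request` or `requests.post`) should behave the same as Zed's provider configured above, assuming the key is valid.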

Note the different endpoint from regular mistral models.

bersace avatar Aug 12 '24 20:08 bersace

Note the different endpoint from regular mistral models.

Can you also use codestral as an Ollama pull?

kanelee avatar Aug 13 '24 19:08 kanelee

Note the different endpoint from regular mistral models.

Can you also use codestral as an Ollama pull?

I don't have the hardware.

bersace avatar Aug 14 '24 06:08 bersace

Codestral is too large for my machine. I’m on an M1 Mac mini 16GB of RAM. However, other, smaller Ollama Pulls Work.


kanelee avatar Aug 14 '24 09:08 kanelee

Codestral Fill In The Middle (FIM) works like a charm on VS Code with the continue.dev plugin. Local Ollama models such as starcoder are also light and interesting.

Currently Zed does not support other models or Ollama models for code completion. Is this feature planned, or does it depend on commercial agreements with AI providers?

vlebert avatar Aug 15 '24 18:08 vlebert

I have found that any model that can be pulled on Ollama works in Zed. The limitation is the user's computer: memory and processor. Codestral is slow on my machine because I only have 16GB on my M1. Has anyone tried building a Linux AI server with tons of RAM and a GPU that you can remote into using the gateway method in Zed?

kanelee avatar Aug 15 '24 19:08 kanelee
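For anyone trying the local route: a sketch of the corresponding Zed settings for a locally pulled model, assuming `ollama pull codestral` has completed and Ollama is serving on its default port (the field names here mirror the OpenAI settings block earlier in the thread and Zed's Ollama provider; check the current Zed docs, as they may have changed):

```json
{
  "language_models": {
    "ollama": {
      "api_url": "http://localhost:11434",
      "available_models": [
        {
          "name": "codestral",
          "display_name": "Codestral (local)",
          "max_tokens": 32768
        }
      ]
    }
  }
}
```

Note this only makes the model available in the assistant panel, not as a completion (FIM) provider, which is the gap this issue is about.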

@kanelee they do work for assistant but how do you use a custom code completion (copilot) model ?

vlebert avatar Aug 15 '24 19:08 vlebert

That I do not know. I am just learning how to use Zed. Have you tried the online assistant?

kanelee avatar Aug 15 '24 19:08 kanelee

There is custom code completion using Supermaven + Copilot. You can specify this in the settings JSON file.

kanelee avatar Aug 16 '24 17:08 kanelee

This is not "custom"; they are the only options available in Zed at the moment. My point is to use Codestral for code completion.

vlebert avatar Aug 16 '24 17:08 vlebert

My apologies for the confusion. I meant to say that you can "customize" your assistant to use either Copilot or Supermaven. I know that doesn't help you. Have you put in a request on GitHub?

kanelee avatar Aug 16 '24 20:08 kanelee

Well, I believe it is actually the main topic of the current issue. Check the title :)

vlebert avatar Aug 16 '24 20:08 vlebert

Wow. Been making so many faux pas with my email responses today. Sorry 😂

kanelee avatar Aug 16 '24 20:08 kanelee

I am also interested in this feature, to run FIM with a local model. Qwen2.5-Coder also does a good job at inline completion.

tbocek avatar Oct 19 '24 20:10 tbocek

Apologies for resurrecting this issue discussion. Has there been any movement on adding Mistral alongside OpenAI and friends? #15573 seems to have done a lot of the heavy lifting on this already.

RoryLawless avatar Jan 09 '25 13:01 RoryLawless

A new version of Codestral was just released and it's much better than the previous one.

I think Codestral and its inline completions should be a first class citizen in Zed.

choucavalier avatar Jan 17 '25 11:01 choucavalier

@maxdeviant Not sure you should close this one, as Codestral "code suggestion" is not exactly the same thing as Mistral in the assistant panel. It is a "fill in the middle" alternative to Supermaven for autocompletion.

See https://docs.mistral.ai/capabilities/code_generation/#fill-in-the-middle-endpoint

It is a completion provider implemented in Continue.dev on VS Code, for example.

vlebert avatar Feb 14 '25 19:02 vlebert
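The fill-in-the-middle endpoint linked above takes the code before and after the cursor and returns the middle. A minimal sketch of the request body, following the field names on that docs page (the `build_fim_request` helper is hypothetical; verify the current fields against the Mistral documentation):

```python
# Sketch of a request body for Mistral's FIM endpoint:
# POST https://codestral.mistral.ai/v1/fim/completions
def build_fim_request(prompt: str, suffix: str,
                      model: str = "codestral-latest") -> dict:
    """Build the JSON body for a fill-in-the-middle completion."""
    return {
        "model": model,
        "prompt": prompt,   # code before the cursor
        "suffix": suffix,   # code after the cursor; the model fills the gap
        "max_tokens": 64,
        "temperature": 0,
    }

body = build_fim_request("def fib(n):\n", "\nprint(fib(10))")
print(body["model"])  # codestral-latest
```

This prompt/suffix shape is what distinguishes a FIM provider from the chat-completions API the assistant panel uses, and is presumably what an `edit_prediction_provider` integration in Zed would need to call.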

@maxdeviant Not sure you should close this one, as Codestral "code suggestion" is not exactly the same thing as Mistral in the assistant panel. It is a "fill in the middle" alternative to Supermaven for autocompletion.

See https://docs.mistral.ai/capabilities/code_generation/#fill-in-the-middle-endpoint

It is a completion provider implemented in Continue.dev on VS Code, for example.

Hello, so basically you're saying that we should be able to add edit_prediction_provider: "mistral" in the future?

robikovacs avatar Feb 21 '25 09:02 robikovacs

@robikovacs Yes, Codestral to be precise

vlebert avatar Feb 21 '25 09:02 vlebert

@robikovacs Can we reopen this issue? By the way, despite previous changes, Mistral is still not available in the assistant panel.

vlebert avatar Feb 26 '25 18:02 vlebert

@robikovacs Can we reopen this issue? By the way, despite previous changes, Mistral is still not available in the assistant panel.

If you would like to see support for Mistral as a completion provider, please open a new issue.

maxdeviant avatar Feb 26 '25 18:02 maxdeviant