README says it supports all other models?
Is this actually true? I don't think so, given the custom features added that seem to be OpenAI-specific:
> Model Providers: This template ships with OpenAI gpt-4o as the default. However, with the AI SDK, you can switch LLM providers to OpenAI, Anthropic, Cohere, and many more with just a few lines of code.
How do I use different models without getting errors? @jb-dev1
Definitely feels a bit weird! Looks like we need to edit `lib/ai/index.ts` with the correct provider to connect to different models, e.g. for Mistral that would be:
```ts
import { mistral } from '@ai-sdk/mistral'; // add this import

export const customModel = (apiIdentifier: string) => {
  return wrapLanguageModel({
    model: mistral(apiIdentifier), // previously: model: openai(apiIdentifier)
    middleware: customMiddleware,
  });
};
```
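The Mistral provider reads `MISTRAL_API_KEY` from the environment by default; if you'd rather pass the key explicitly, a minimal sketch (untested) would be:

```ts
import { createMistral } from '@ai-sdk/mistral';

// Explicit provider instance instead of the default `mistral` export.
// Assumes the key is stored in .env.local as MISTRAL_API_KEY.
const mistral = createMistral({
  apiKey: process.env.MISTRAL_API_KEY,
});
```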
For Azure:
```ts
import { azure } from '@ai-sdk/azure'; // add this import

export const customModel = (apiIdentifier: string) => {
  return wrapLanguageModel({
    model: azure(apiIdentifier),
    middleware: customMiddleware,
  });
};
```
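Note that for Azure the `apiIdentifier` has to match your Azure deployment name rather than an OpenAI model id, and the provider reads `AZURE_RESOURCE_NAME` and `AZURE_API_KEY` from the environment by default. If you want to configure it explicitly, a minimal sketch (untested) would be:

```ts
import { createAzure } from '@ai-sdk/azure';

// Explicit provider instance instead of the default `azure` export.
// Assumes the resource name and key live in .env.local.
const azure = createAzure({
  resourceName: process.env.AZURE_RESOURCE_NAME,
  apiKey: process.env.AZURE_API_KEY,
});
```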
Another way could be to add if statements that check the model IDs and return the appropriate custom model, e.g.:
```ts
// providers imported at the top of lib/ai/index.ts
import { openai } from '@ai-sdk/openai';
import { mistral } from '@ai-sdk/mistral';

// inside customModel(apiIdentifier):
if (apiIdentifier.includes('gemini')) {
  return wrapLanguageModel({
    model: createGeminiModel(apiIdentifier), // your own Gemini factory (e.g. wrapping @ai-sdk/google)
    middleware: customMiddleware,
  });
} else if (apiIdentifier.includes('mistral') || apiIdentifier.includes('ministral')) {
  return wrapLanguageModel({
    model: mistral(apiIdentifier),
    middleware: customMiddleware,
  });
} else if (apiIdentifier.includes('gpt')) {
  return wrapLanguageModel({
    model: openai(apiIdentifier),
    middleware: customMiddleware,
  });
}
```
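Substring checks like these are easy to get wrong, so another option is to key off an explicit provider name instead. A rough sketch, assuming you add a `provider` field of your own to each entry in `models.ts` and pass it through (this is not part of the template):

```ts
import { openai } from '@ai-sdk/openai';
import { mistral } from '@ai-sdk/mistral';

// wrapLanguageModel and customMiddleware come from the existing lib/ai/index.ts.
export const customModel = (
  apiIdentifier: string,
  provider: 'openai' | 'mistral' = 'openai',
) => {
  const model =
    provider === 'mistral' ? mistral(apiIdentifier) : openai(apiIdentifier);

  return wrapLanguageModel({
    model,
    middleware: customMiddleware,
  });
};
```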
Make sure the new model is added to `models.ts` correctly:
```ts
export const models: Array<Model> = [
  {
    id: 'gpt-4o-mini',
    label: 'GPT 4o mini',
    apiIdentifier: 'gpt-4o-mini',
    description: 'Small model for fast, lightweight tasks',
  },
  {
    id: 'ministral-3b-latest',
    label: 'Ministral-3b-latest',
    apiIdentifier: 'ministral-3b-latest',
    description: 'Small model served by Mistral',
  },
];
```
I also found this PR that adds multi-provider support but hasn't been merged - https://github.com/vercel/ai-chatbot/pull/547 (Adds Gemini 1.5 Flash Model - integrates Google's Gemini as an alternative to OpenAI, expanding model choices for users)
I've added Anthropic following your instructions, but it interacts weirdly with the "canvas": it looks like it writes the document and then loops, correcting itself as if you had asked it to, rewriting the whole canvas twice and ruining it.
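For reference, what I added is essentially the same pattern as above, roughly this (the `claude` check is just my own guess at a sensible condition, not from the template):

```ts
import { anthropic } from '@ai-sdk/anthropic';

// inside customModel(apiIdentifier):
if (apiIdentifier.includes('claude')) {
  return wrapLanguageModel({
    model: anthropic(apiIdentifier), // e.g. 'claude-3-5-sonnet-20241022'
    middleware: customMiddleware,
  });
}
```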