Only OpenAI?
https://github.com/vercel/ai-chatbot/blob/main/.env.example
According to the file above, it looks like API keys can only be obtained from OpenAI, but I would like to use other providers such as DeepSeek, Gemini, etc.
Or is it my misunderstanding?
You can define and use your own models in the `root/lib/ai/models.ts` file. While the `.env.example` file only shows how to set up an OpenAI key, that doesn't mean you can't use other models. Feel free to add additional models as needed.
For instance, in your `.env.local` file, you can define the necessary API keys:

```bash
# .env.local
DEEPSEEK_API_KEY=${YOUR_KEY}
```

Then register the models you want to use in the `root/lib/ai/models.ts` file. For example:
```ts
import { openai } from '@ai-sdk/openai';
import { deepseek } from '@ai-sdk/deepseek';
import {
  customProvider,
  extractReasoningMiddleware,
  wrapLanguageModel,
} from 'ai';

export const myProvider = customProvider({
  languageModels: {
    'chat-model-small': openai('gpt-4o-mini'),
    // Example of DeepSeek
    'chat-model-large': deepseek('your-deepseek-model-name'),
    'chat-model-reasoning': wrapLanguageModel({
      model: openai('o3-mini'),
      middleware: extractReasoningMiddleware({ tagName: 'think' }),
    }),
  },
});
```
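Once registered, these ids can be used anywhere the AI SDK expects a model. A minimal usage sketch, assuming the project's `@/` import alias and that the relevant API keys are set in `.env.local`:

```typescript
import { generateText } from 'ai';
import { myProvider } from '@/lib/ai/models'; // assumed '@/' path alias

// Minimal sketch: resolve a registered model id and generate a completion.
// Requires the matching provider API key in the environment.
async function demo() {
  const { text } = await generateText({
    model: myProvider.languageModel('chat-model-large'),
    prompt: 'Summarize the AI SDK provider abstraction in one sentence.',
  });
  console.log(text);
}
```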
You can also explore the many other providers available in [@ai-sdk](https://sdk.vercel.ai/docs/introduction). Keep in mind that `.env.example` simply demonstrates the default OpenAI setup; by modifying the project configuration slightly, you can integrate other models without issue.
Hi! Is there a way to use the openai package but set a different base_url so we can use OpenAI API compatible APIs?
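For what it's worth, the `@ai-sdk/openai` package exposes a `createOpenAI` factory that accepts a `baseURL` option, which should cover OpenAI-compatible endpoints. A sketch — the endpoint URL and env variable name here are illustrative assumptions, not taken from the repo:

```typescript
import { createOpenAI } from '@ai-sdk/openai';

// Sketch: point the OpenAI-compatible client at another endpoint.
// URL and env variable name below are assumptions for illustration.
const deepseekCompatible = createOpenAI({
  baseURL: 'https://api.deepseek.com/v1',
  apiKey: process.env.DEEPSEEK_API_KEY,
});

// Models created from this instance go through the custom endpoint,
// e.g. deepseekCompatible('deepseek-chat')
```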
Ok, thank you for your answers.
If different providers are available, is there a way to select them on the UI? Or is this something I would have to implement separately?
@0Chan-smc For now, you need to modify the API service provider in your code. Of course, you can also add a selection function to the UI yourself.
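A minimal sketch of what such a selection function could look like, assuming the UI keeps a list of entries whose ids match the keys registered in `customProvider` (the `ChatModel` shape and `resolveModelId` helper are hypothetical, not from the repo):

```typescript
// Hypothetical shape for entries shown in a model-selector dropdown.
interface ChatModel {
  id: string;          // key registered in customProvider's languageModels map
  name: string;        // label shown in the UI
  description: string;
}

const chatModels: ChatModel[] = [
  { id: 'chat-model-small', name: 'GPT-4o mini', description: 'Fast, lightweight tasks' },
  { id: 'chat-model-large', name: 'DeepSeek', description: 'Complex, multi-step tasks' },
];

// Map the id selected in the UI back to a registered model id,
// falling back to the first entry for unknown ids.
function resolveModelId(selected: string): string {
  const found = chatModels.find((m) => m.id === selected);
  return (found ?? chatModels[0]).id;
}
```

The server-side chat route can then pass `resolveModelId(...)` to the provider, so adding a model only requires extending the list and the provider registration.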
Can I add multiple models, for example OpenAI, Anthropic, etc., and select between them from the model selector?