local model provider
I built vibesdk locally using `bun run dev` and `bun run local`, with the `db:migrate:local` script (derived from `db:migrate`).
In the user settings tab, only AI models like OpenAI and Gemini can be selected, even though I have configured OpenRouter and Grok as providers.
My question: besides cloud models, is it possible to configure a local model like Ollama?
I searched the code for the keyword "ollama" and found the code below. I also checked the DB schema and found the table `user_model_providers`, which is for configuring custom OpenAI-compatible providers. So how do I deploy a local model?
```ts
export const userModelProviders = sqliteTable('user_model_providers', {
    id: text('id').primaryKey(),
    userId: text('user_id').notNull().references(() => users.id, { onDelete: 'cascade' }),

    // Provider Details
    name: text('name').notNull(), // User-friendly name (e.g., "My Local Ollama")
    baseUrl: text('base_url').notNull(), // OpenAI-compatible API base URL
    secretId: text('secret_id').references(() => userSecrets.id), // API key stored in userSecrets

    // Status and Metadata
    isActive: integer('is_active', { mode: 'boolean' }).default(true),
    createdAt: integer('created_at', { mode: 'timestamp' }).default(sql`CURRENT_TIMESTAMP`),
    updatedAt: integer('updated_at', { mode: 'timestamp' }).default(sql`CURRENT_TIMESTAMP`),
}, (table) => ({
    userNameIdx: uniqueIndex('user_model_providers_user_name_idx').on(table.userId, table.name),
    userIdx: index('user_model_providers_user_idx').on(table.userId),
    isActiveIdx: index('user_model_providers_is_active_idx').on(table.isActive),
}));
```
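For context, a row in that table pointing at a local Ollama server would presumably look something like this (the `id`/`userId` values are placeholders I made up; the `baseUrl` is Ollama's default OpenAI-compatible endpoint):

```typescript
// Hypothetical user_model_providers row for a local Ollama server.
// Field names mirror the Drizzle schema above; id/userId are placeholders.
const localOllamaProvider = {
  id: "prov_local_ollama",               // placeholder primary key
  userId: "user_123",                    // placeholder user id
  name: "My Local Ollama",               // user-friendly name, unique per user
  baseUrl: "http://localhost:11434/v1",  // Ollama's OpenAI-compatible base URL
  secretId: null,                        // Ollama requires no API key by default
  isActive: true,
};

console.log(localOllamaProvider.baseUrl);
```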
Thank you for raising the issue. You can configure vibesdk to use Gemini, OpenAI, Claude, or any other provider that has an OpenAI-compatible API endpoint. What you need are the env vars `CLOUDFLARE_AI_GATEWAY_URL`, for setting the base URL, and `CLOUDFLARE_AI_GATEWAY_TOKEN`, for setting the API token.
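For reference, the override might look roughly like this in your environment file (all values below are illustrative placeholders, not real credentials; substitute your own gateway URL and token):

```shell
# Point the base URL at any OpenAI-compatible endpoint
# (Cloudflare AI Gateway shown here; placeholder account/gateway names)
CLOUDFLARE_AI_GATEWAY_URL="https://gateway.ai.cloudflare.com/v1/<account_id>/<gateway_name>/openai"
CLOUDFLARE_AI_GATEWAY_TOKEN="your-gateway-token"
```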
Using this override, you can use any OpenAI-compatible server for inference, including one running on your local machine. Ollama provides such an endpoint: https://docs.ollama.com/openai
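As a minimal sketch of what hitting that endpoint directly looks like (assuming Ollama is running locally on its default port 11434 and a model such as "llama3" has already been pulled; both are assumptions):

```typescript
// Builds the same kind of request the OpenAI SDK would send to
// an OpenAI-compatible /chat/completions endpoint, targeted at Ollama.
const OLLAMA_BASE_URL = "http://localhost:11434/v1";

function buildChatRequest(model: string, prompt: string) {
  return {
    url: `${OLLAMA_BASE_URL}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Ollama ignores the key, but OpenAI-style clients expect one to be set
        Authorization: "Bearer ollama",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

const req = buildChatRequest("llama3", "Hello from vibesdk!");
// To actually call it (requires a running Ollama server):
// fetch(req.url, req.init).then(r => r.json()).then(console.log);
```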
This is just theory and I haven't tested it yet, but it should pretty much work. You might need to alter the AI config table, though (worker/agents/inferutils/config.types.ts). It should be fairly easy; after all, under the hood it just uses the OpenAI SDK. In fact, if things don't work, you can start by overriding things in inferutils/core.ts as well. Let me know how it goes.
We just merged #129 and #131, which should let you set up the AI gateway or a custom provider in a much easier and simpler way via `bun run setup`, along with all the other essential development setup requirements.
You may try that and let me know how it goes.