# AI SDK Provider for Gemini CLI

A community provider for the Vercel AI SDK that enables using Google's Gemini models through `@google/gemini-cli-core` and Google Cloud Code endpoints. Free to use via a Gemini Code Assist license.
## Version Compatibility

| Provider Version | AI SDK Version | NPM Tag | Branch |
|---|---|---|---|
| 2.x | v6 | `latest` | `main` |
| 1.x | v5 | `ai-sdk-v5` | `ai-sdk-v5` |
| 0.x | v4 | `ai-sdk-v4` | `ai-sdk-v4` |
```bash
# AI SDK v6 (default)
npm install ai-sdk-provider-gemini-cli ai

# AI SDK v5
npm install ai-sdk-provider-gemini-cli@ai-sdk-v5 ai@^5.0.0

# AI SDK v4
npm install ai-sdk-provider-gemini-cli@ai-sdk-v4 ai@^4.3.16
```
## Installation

1. Install and authenticate the Gemini CLI:

   ```bash
   npm install -g @google/gemini-cli
   gemini # Follow the interactive authentication setup
   ```

2. Add the provider to your project:

   ```bash
   npm install ai-sdk-provider-gemini-cli ai
   ```
## Quick Start

```typescript
import { generateText } from 'ai';
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

const gemini = createGeminiProvider({
  authType: 'oauth-personal',
});

const result = await generateText({
  model: gemini('gemini-3-pro-preview'),
  prompt: 'Write a haiku about coding',
});

console.log(result.text);
```
## Authentication

### OAuth (Recommended)

Uses credentials from `~/.gemini/oauth_creds.json`, created by the Gemini CLI:

```typescript
const gemini = createGeminiProvider({
  authType: 'oauth-personal',
});
```
### API Key

```typescript
const gemini = createGeminiProvider({
  authType: 'api-key',
  apiKey: process.env.GEMINI_API_KEY,
});
```

Get your API key from Google AI Studio.
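If the key may be absent at runtime, a small guard before constructing the provider fails fast with a clear message. This is a sketch: the `requireEnv` helper is hypothetical, not part of this package.

```typescript
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

// Hypothetical helper: read an environment variable or fail loudly.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`${name} is not set; create a key in Google AI Studio.`);
  }
  return value;
}

const gemini = createGeminiProvider({
  authType: 'api-key',
  apiKey: requireEnv('GEMINI_API_KEY'),
});
```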
## Supported Models

- `gemini-3-pro-preview` - Latest model with enhanced reasoning (Preview)
- `gemini-3-flash-preview` - Fast, efficient model (Preview)
- `gemini-2.5-pro` - Previous-generation model (64K output tokens)
- `gemini-2.5-flash` - Previous-generation fast model (64K output tokens)
## Features
- Streaming responses
- Tool/function calling
- Structured output with Zod schemas
- Multimodal support (text and base64 images)
- TypeScript support
- Configurable logging
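As an illustration of the structured-output feature, here is a sketch using the AI SDK's `generateObject` with a Zod schema, assuming OAuth authentication as in Quick Start. The schema and prompt are made up for the example.

```typescript
import { generateObject } from 'ai';
import { z } from 'zod';
import { createGeminiProvider } from 'ai-sdk-provider-gemini-cli';

const gemini = createGeminiProvider({ authType: 'oauth-personal' });

// Hypothetical schema for the example: a recipe with a name and ingredients.
const recipeSchema = z.object({
  name: z.string(),
  ingredients: z.array(z.string()),
});

const { object } = await generateObject({
  model: gemini('gemini-3-flash-preview'),
  schema: recipeSchema,
  prompt: 'Suggest a simple pancake recipe.',
});

// `object` is validated against the schema before being returned.
console.log(object.name, object.ingredients);
```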
## Configuration

```typescript
const model = gemini('gemini-3-pro-preview', {
  temperature: 0.7,
  maxOutputTokens: 1000,
  topP: 0.95,
});
```
## Logging

```typescript
// Disable logging
const model = gemini('gemini-3-flash-preview', { logger: false });

// Enable verbose debug logging
const model = gemini('gemini-3-flash-preview', { verbose: true });

// Custom logger
const model = gemini('gemini-3-flash-preview', {
  logger: {
    debug: (msg) => myLogger.debug(msg),
    info: (msg) => myLogger.info(msg),
    warn: (msg) => myLogger.warn(msg),
    error: (msg) => myLogger.error(msg),
  },
});
```
## Examples

See the `examples/` directory for comprehensive examples:

- `check-auth.mjs` - Verify authentication
- `basic-usage.mjs` - Text generation
- `streaming.mjs` - Streaming responses
- `generate-object-basic.mjs` - Structured output with Zod
- `tool-calling.mjs` - Function calling

```bash
npm run build
npm run example:check
npm run example:basic
```
## Breaking Changes

### v2.x (AI SDK v6)

- Provider interface: `ProviderV2` → `ProviderV3`
- Token usage: flat → hierarchical structure
- Warning format: `unsupported-setting` → `unsupported`
- Method rename: `textEmbeddingModel()` → `embeddingModel()`
- Finish reason: string → `{ unified, raw }` object

See CHANGELOG.md for details.
## Limitations

- Requires Node.js >= 20
- OAuth requires a global Gemini CLI installation
- Image URLs not supported (use base64)
- Some parameters not supported: `frequencyPenalty`, `presencePenalty`, `seed`
- Abort signals work, but the underlying requests continue in the background
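The abort-signal caveat can be seen without the provider at all: aborting releases the awaiting caller, but work that has already started runs to completion. A minimal Node sketch, where the `abortable` helper and the timed task are illustrative stand-ins, not part of this package:

```typescript
// Race a promise against an AbortSignal: the caller is released on abort,
// but the underlying task is not cancelled and keeps running.
function abortable<T>(promise: Promise<T>, signal: AbortSignal): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    signal.addEventListener('abort', () => reject(new Error('aborted')), { once: true });
    promise.then(resolve, reject);
  });
}

async function main(): Promise<void> {
  let finished = false;
  // A stand-in for an in-flight request that takes ~50 ms.
  const task = new Promise<void>((resolve) =>
    setTimeout(() => { finished = true; resolve(); }, 50)
  );

  const controller = new AbortController();
  setTimeout(() => controller.abort(), 10); // abort after 10 ms

  try {
    await abortable(task, controller.signal);
  } catch {
    console.log('aborted; finished =', finished); // the task has not finished yet
  }

  await task; // ...but it still completes in the background
  console.log('after waiting; finished =', finished);
}

await main();
```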
## Disclaimer

This is an unofficial community provider, not affiliated with Google or Vercel. Your data is sent to Google's servers. See Google's Terms of Service.
## License
MIT