feat: add first-class embedding model support for OpenRouter (v4 & v5)
## Summary

This PR adds first-class support for embedding models in the OpenRouter provider, compatible with AI SDK v5 and v4. It introduces the `OpenRouterEmbeddingModel` class for generating embeddings, along with schema validation, API integration, and comprehensive tests. The provider now exposes `textEmbeddingModel` (v5) and a deprecated `embedding` alias (v4) for parity with other AI SDK providers.

This implementation enables semantic search, RAG pipelines, and vector-native features using OpenRouter's embedding API, with support for custom routing preferences and user tracking.
## Changes
- `src/embedding/`: New directory with `index.ts` implementing the `OpenRouterEmbeddingModel` class, `schemas.ts` for response validation using Zod, and `index.test.ts` with Vitest tests covering instantiation, single/batch embeddings, custom settings, and edge cases (e.g., missing usage data).
- `src/facade.ts` and `src/provider.ts`: Extended to expose the `textEmbeddingModel` and `embedding` methods, creating embedding model instances with configurable settings.
- `src/types/openrouter-embedding-settings.ts`: New types for embedding-specific settings, including provider routing (e.g., order, fallbacks) and user ID.
- `src/types/index.ts`: Re-exports the embedding settings types for easy access.
- `README.md`: Updated documentation with usage examples for v5/v4, batch embeddings, and a list of supported models.
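A rough sketch of the facade wiring described above — only the `textEmbeddingModel` and `embedding` method names come from this PR; the stand-in class and object names here are purely illustrative:

```typescript
// Illustrative sketch only: the real OpenRouterEmbeddingModel lives in
// src/embedding/index.ts and talks to the OpenRouter API. A minimal
// stand-in class shows how the facade can expose both method names.
interface EmbeddingSettingsSketch {
  user?: string;
}

class EmbeddingModelSketch {
  constructor(
    readonly modelId: string,
    readonly settings: EmbeddingSettingsSketch = {},
  ) {}
}

const openrouterFacadeSketch = {
  // v5-style entry point
  textEmbeddingModel: (modelId: string, settings?: EmbeddingSettingsSketch) =>
    new EmbeddingModelSketch(modelId, settings),
  // v4-style deprecated alias delegating to the same constructor
  embedding: (modelId: string, settings?: EmbeddingSettingsSketch) =>
    new EmbeddingModelSketch(modelId, settings),
};

console.log(
  openrouterFacadeSketch.embedding('openai/text-embedding-3-small').modelId,
); // → openai/text-embedding-3-small
```

Keeping both names pointing at one constructor is what gives v4 callers a working `embedding` alias without duplicating the model implementation.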
## Usage Examples
### For AI SDK v5

```typescript
import { embed } from 'ai';
import { openrouter } from '@openrouter/ai-sdk-provider';

const { embedding } = await embed({
  model: openrouter.textEmbeddingModel('openai/text-embedding-3-small'),
  value: 'sunny day at the beach',
});
```
For batch embeddings:

```typescript
import { embedMany } from 'ai';
import { openrouter } from '@openrouter/ai-sdk-provider';

const { embeddings } = await embedMany({
  model: openrouter.textEmbeddingModel('openai/text-embedding-3-small'),
  values: [
    'sunny day at the beach',
    'rainy day in the city',
    'snowy mountain peak',
  ],
});
```
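Once you have embeddings, semantic search reduces to ranking by vector similarity. A minimal, self-contained sketch with tiny hand-made vectors standing in for real embedding output (the `ai` package also ships a `cosineSimilarity` helper; this one is hand-rolled for illustration):

```typescript
// Rank candidate texts against a query vector by cosine similarity.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Toy 3-dimensional stand-ins; real embeddings have hundreds of dimensions.
const query = [1, 0, 0];
const candidates: Array<[string, number[]]> = [
  ['sunny day at the beach', [0.9, 0.1, 0]],
  ['rainy day in the city', [0.2, 0.9, 0.1]],
  ['snowy mountain peak', [0, 0.2, 0.9]],
];

const ranked = candidates
  .map(([text, vec]) => ({ text, score: cosineSimilarity(query, vec) }))
  .sort((x, y) => y.score - x.score);

console.log(ranked[0].text); // → sunny day at the beach
```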
### For AI SDK v4 (Deprecated)

```typescript
import { embed } from 'ai';
import { openrouter } from '@openrouter/ai-sdk-provider';

const { embedding } = await embed({
  model: openrouter.embedding('openai/text-embedding-3-small'),
  value: 'sunny day at the beach',
});
```
Custom settings example (e.g., provider routing):

```typescript
const model = openrouter.textEmbeddingModel('openai/text-embedding-3-small', {
  user: 'test-user-123',
  provider: {
    order: ['openai'],
    allow_fallbacks: false,
  },
});
```
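Settings like these presumably end up in the JSON body POSTed to OpenRouter's embeddings endpoint. A hedged sketch of that serialization — the field names mirror the settings above, but `buildEmbeddingRequestBody` is a hypothetical helper for illustration, not part of this PR:

```typescript
// Hypothetical serialization of embedding settings into an OpenRouter-style
// request body; the actual implementation lives in src/embedding/index.ts.
interface ProviderRouting {
  order?: string[];
  allow_fallbacks?: boolean;
}

interface EmbeddingRequestSettings {
  user?: string;
  provider?: ProviderRouting;
}

function buildEmbeddingRequestBody(
  model: string,
  input: string | string[],
  settings: EmbeddingRequestSettings = {},
): Record<string, unknown> {
  return {
    model,
    input,
    // Optional fields are only included when set, keeping the payload minimal.
    ...(settings.user !== undefined && { user: settings.user }),
    ...(settings.provider !== undefined && { provider: settings.provider }),
  };
}

const body = buildEmbeddingRequestBody(
  'openai/text-embedding-3-small',
  'sunny day at the beach',
  { user: 'test-user-123', provider: { order: ['openai'], allow_fallbacks: false } },
);
console.log(JSON.stringify(body));
```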
## Tests

- Added 253 lines of tests in `src/embedding/index.test.ts` covering:
  - Provider method exposure (`textEmbeddingModel`, `embedding`).
  - Model instantiation and configuration.
  - Single and batch embedding generation.
  - Custom request settings (e.g., user, provider options).
  - Handling responses without usage data.
- Uses mocked fetch responses to simulate API behavior.
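The mocked-fetch approach can be sketched without Vitest: stub a fetch-shaped function that returns a canned OpenAI-style embeddings payload, then assert on what the caller extracts. The payload shape follows OpenRouter's OpenAI-compatible format, but the function names and URL here are illustrative, not the PR's actual test code:

```typescript
// Minimal stand-in for a mocked fetch: returns a canned embeddings response
// so parsing logic can be exercised without hitting the network.
type FetchLike = (url: string, init?: unknown) => Promise<{ json(): Promise<any> }>;

const mockFetch: FetchLike = async () => ({
  json: async () => ({
    data: [{ object: 'embedding', index: 0, embedding: [0.1, 0.2, 0.3] }],
    model: 'openai/text-embedding-3-small',
    usage: { prompt_tokens: 6, total_tokens: 6 },
  }),
});

async function fetchEmbedding(fetchImpl: FetchLike, value: string): Promise<number[]> {
  const res = await fetchImpl('https://openrouter.ai/api/v1/embeddings', {
    method: 'POST',
    body: JSON.stringify({ model: 'openai/text-embedding-3-small', input: value }),
  });
  const payload = await res.json();
  // The embedding vector is returned even when usage data is absent,
  // mirroring the edge case the tests cover.
  return payload.data[0].embedding;
}

fetchEmbedding(mockFetch, 'sunny day at the beach').then((vec) => {
  console.log(vec.length); // → 3
});
```

Injecting the fetch implementation keeps the code under test identical in production and tests; only the transport is swapped.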
## References

- OpenRouter Embeddings API: https://openrouter.ai/docs/api-reference/embeddings
- AI SDK v4 Embeddings: https://v4.ai-sdk.dev/docs/reference/ai-sdk-core/embed#embedding
- AI SDK v5 Embeddings: https://ai-sdk.dev/docs/ai-sdk-core/embeddings
- Supported Embedding Models: https://openrouter.ai/models?output_modalities=embeddings (e.g., `openai/text-embedding-3-small`, `openai/text-embedding-3-large`, `openai/text-embedding-ada-002`)
It is currently awaiting review/merge from the repository owners.
### In the meantime – quick workaround
If you need the feature right now, you can use the branch from my fork:
```shell
# Clone my fork and checkout the feature branch
git clone https://github.com/Loule95450/ai-sdk-provider.git
cd ai-sdk-provider
git checkout feat/add-embed-support

# Install & build
pnpm install
pnpm build

# Then install it locally in your project
cd /path/to/your/project
npm i /path/to/ai-sdk-provider-cloned
```
Hey, any updates on when this should be live?
Also, we're working on https://github.com/OpenRouterTeam/ai-sdk-provider/tree/next, which should be available on the alpha channel soon. We'll have to bring embeddings in there too.