feat(js/plugins/vertexai): add support to anthropic prompt caching
Closes #2885
When using Anthropic models, you can enable cache control for system messages by adding a `cacheControl` property to the `custom` field of a message part. The only cache type currently supported is `ephemeral`.
For example, to cache a specific system message, you would set it up like this:
```ts
const llmResponse = await ai.generate({
  model: claude3Sonnet, // or another Anthropic model
  messages: [
    {
      role: 'system',
      content: [
        {
          text: 'This is an important instruction that can be cached.',
          custom: {
            cacheControl: {
              type: 'ephemeral',
            },
          },
        },
      ],
    },
    {
      role: 'user',
      content: [{ text: 'What should I do when I visit Melbourne?' }],
    },
  ],
});
```
Related Anthropic SDK references:
- https://github.com/anthropics/anthropic-sdk-typescript/releases/tag/sdk-v0.33.0
- https://github.com/anthropics/anthropic-sdk-typescript/pull/631
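For context, the linked SDK release and PR expose prompt caching through a `cache_control` field on content blocks in the Anthropic TypeScript SDK. As a rough sketch (not the plugin's actual implementation), the `custom.cacheControl` value above is expected to translate into an SDK call along these lines; the client setup, model id, and `max_tokens` value below are illustrative assumptions:

```ts
// Sketch only: shows what a cached system block looks like in a raw
// Anthropic SDK call. Model name and token limit are illustrative.
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

const response = await client.messages.create({
  model: 'claude-3-5-sonnet-latest', // illustrative model id
  max_tokens: 1024,
  system: [
    {
      type: 'text',
      text: 'This is an important instruction that can be cached.',
      // Genkit's custom.cacheControl is expected to map to this field.
      cache_control: { type: 'ephemeral' },
    },
  ],
  messages: [
    { role: 'user', content: 'What should I do when I visit Melbourne?' },
  ],
});
```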
Checklist (if applicable):
- [x] PR title follows https://www.conventionalcommits.org/en/v1.0.0/
- [x] Tested (manually, unit tested, etc.)
- [x] Docs updated (updated docs or a docs bug required)