
Implement caching support for Anthropic API

Opened by devin-ai-integration[bot] · 6 months ago · 0 comments · Status: Open


This pull request proposes enhancements to the PromptingTools.jl package by introducing caching support for the Anthropic API wrapper. Caching cuts down on repeated identical requests, reducing both latency and API usage.

Key Changes:

  • Added a cache keyword argument to aigenerate to control caching behavior.
  • Integrated helper functions for caching: _generate_cache_key, _get_from_cache, and _add_to_cache.
  • Updated AIMessage with a meta field to store cache performance statistics.
  • Initialized global caches: SYSTEM_CACHE, TOOLS_CACHE, LAST_CACHE.
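The helper names above come from the PR description, but their signatures and internals are not shown there. The following is a minimal sketch of how such helpers could fit together, assuming string keys derived from the request payload; everything below the function names is an assumption, not the actual PR code:

```julia
using SHA  # Julia standard library, for stable content hashing

# Global caches as named in the PR; the concrete Dict types are assumed here.
const SYSTEM_CACHE = Dict{String,Any}()
const TOOLS_CACHE  = Dict{String,Any}()
const LAST_CACHE   = Dict{String,Any}()

# Derive a stable key from the model, prompt, and keyword arguments
# (hypothetical implementation: concatenate and SHA-256 the payload).
function _generate_cache_key(model::AbstractString, prompt::AbstractString; kwargs...)
    payload = string(model, '|', prompt, '|', sort!(collect(kwargs); by = first))
    return bytes2hex(sha256(payload))
end

# Look up a cached response; `nothing` signals a cache miss.
_get_from_cache(cache::AbstractDict, key::AbstractString) = get(cache, key, nothing)

# Store a response under its key and return it for convenient chaining.
function _add_to_cache(cache::AbstractDict, key::AbstractString, value)
    cache[key] = value
    return value
end
```

With helpers shaped like these, aigenerate can compute a key for each request, return early on a hit, and populate the cache on a miss, while recording hit/miss statistics in the new AIMessage meta field.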

Files Modified:

  • llm_anthropic.jl
  • messages.jl

Benefits:

  • Reduces redundant API requests, enhancing efficiency and performance.
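The efficiency gain comes from short-circuiting repeated requests. A minimal sketch of that cache-aware flow, with a placeholder fake_api_call standing in for the real Anthropic request (both names are hypothetical, not from the PR):

```julia
# Hypothetical cache-aware request flow: hit returns immediately, miss calls the API.
const RESPONSE_CACHE = Dict{String,String}()

fake_api_call(prompt) = "response to: $prompt"   # stand-in for the network call

function cached_generate(prompt::String)
    key = string(hash(prompt))
    cached = get(RESPONSE_CACHE, key, nothing)
    cached !== nothing && return cached               # cache hit: skip the API entirely
    return RESPONSE_CACHE[key] = fake_api_call(prompt)  # miss: call, store, return
end
```

On the second call with the same prompt, the stored response is returned without touching the API, which is where the reduction in redundant requests comes from.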

This Devin run was requested by Jan.