
feat: implement LLM monitoring with langchainrb integration

monotykamary opened this issue 5 months ago · 2 comments

Description

This PR introduces a crude implementation of LLM Monitoring with langchainrb integration to the Sentry Ruby SDK. The changes include:

  1. Addition of a new monitoring.rb file in the sentry-ruby/lib/sentry/ai/ directory, which implements AI monitoring functionality.
  2. Creation of a langchain.rb file in both sentry-ruby/lib/sentry/ and sentry-ruby/lib/sentry/ai/ directories, providing LangChain integration for the Sentry Ruby SDK.
  3. Potential updates to span.rb and transaction.rb to support these new features.

These changes enhance Sentry's capabilities in monitoring and integrating with AI-related technologies, particularly focusing on LangChain integration.
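As a rough mental model of the new API (an assumption for illustration, not the PR's actual implementation), `ai_track` can be pictured as wrapping work in a child span with an `ai.*` operation so Sentry recognizes it as AI activity. The `FakeSpan` class below is a stand-in that exists only so the sketch runs without a configured SDK:

```ruby
# Stand-in for a Sentry span -- just enough surface to illustrate the idea.
class FakeSpan
  attr_reader :op, :description, :data

  def initialize(op:, description:)
    @op = op
    @description = description
    @data = {}
  end

  def set_data(key, value)
    @data[key] = value
  end
end

# Hypothetical ai_track: open an "ai.run" child span around the block,
# record the pipeline name as span data, and yield the span to the caller.
def ai_track(description)
  span = FakeSpan.new(op: "ai.run", description: description)
  span.set_data("ai.pipeline.name", description)
  yield span
end

ai_track("Testing") do |span|
  span.op                       # "ai.run"
  span.data["ai.pipeline.name"] # "Testing"
end
```

In the real SDK, the span would come from the active transaction rather than being constructed directly; the point is only that the helper tags a block of work with an AI-specific operation name.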

Current problems

  • The spans don't yet show up on the LLM Monitoring page, even though most, if not all, of the expected span data is emitted by the implementation.
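One plausible cause (an assumption based on how Sentry's Python SDK feeds the same page, so verify against the product docs): the LLM Monitoring views appear to aggregate spans with `ai.*` operations together with a transaction-level `ai_total_tokens_used` measurement, and if no token usage is recorded there is nothing to chart. The plain-hash sketch below shows the shape of the data involved; every key name here is an assumption, not this PR's verified output:

```ruby
# Illustrative shapes only -- the key names mirror what the Python SDK
# emits for LLM Monitoring and are assumptions, not confirmed output.
llm_span = {
  op: "ai.chat_completions.create.langchain",   # an "ai.*" span op
  description: "OpenAI chat completion",
  data: {
    "ai.input_messages" => [{ role: "user", content: "testing input" }],
    "ai.responses"      => ["..."]
  }
}

# Token usage is reported as a transaction measurement; without it the
# LLM Monitoring page may have nothing to aggregate.
measurements = {
  "ai_total_tokens_used" => { value: 42, unit: "none" }
}
```

If the integration sets the span data but never records a token-usage measurement, that would be consistent with the symptom described above.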

Related Issues/PRs

  • #2406
  • #2405

Refactoring

  • No major refactoring was performed in this PR. All changes are related to new feature additions.

Changelog Entry

Added

  • Introduced AI monitoring capabilities (sentry-ruby/lib/sentry/ai/monitoring.rb)
  • Added LangChain integration (sentry-ruby/lib/sentry/langchain.rb and sentry-ruby/lib/sentry/ai/langchain.rb)
  • Enhanced span and transaction handling to support AI monitoring

Basic Testing

require 'sentry-ruby'
require 'langchain'
require 'sentry/langchain'

puts "Initializing Sentry..."
Sentry.init do |config|
  config.dsn = ENV['SENTRY_DSN']
  config.traces_sample_rate = 1.0
  config.debug = true # Enable debug mode for more verbose logging
end

Sentry.with_scope do |scope|
  scope.set_tags(ai_operation: "Testing")

  transaction = Sentry.start_transaction(
    op: "ai.query",
    name: "AI Query Execution"
  )

  # Attach the transaction to the current scope so child spans nest under it
  scope.set_span(transaction)

  begin
    Sentry::AI::Monitoring.ai_track("Testing")
    llm = Langchain::LLM::OpenAI.new(api_key: ENV['OPENAI_API_KEY'])
    result = llm.chat(messages: [{role: "user", content: "testing input"}]).completion
    puts(result)
  rescue => e
    Sentry.capture_exception(e)
    raise e
  ensure
    transaction.finish
  end
end

monotykamary · Sep 23 '24 10:09