
[Javascript] Add LLM monitoring support

Open • AbhiPrasad opened this issue 1 year ago • 2 comments

Problem Statement

Add support for https://docs.sentry.io/product/insights/llm-monitoring/ in JavaScript

Solution Brainstorm

Things we can support:

  • https://github.com/getsentry/sentry-javascript/issues/13679
  • Amazon Bedrock: https://github.com/aws/aws-sdk-js-v3/tree/main/clients/client-bedrock
  • Anthropic SDK: https://github.com/anthropics/anthropic-sdk-typescript
  • Azure OpenAI SDK: https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/openai/openai
  • Cohere AI: https://www.npmjs.com/package/cohere-ai
  • Google Vertex: https://github.com/googleapis/google-cloud-node
  • Google Generative AI: https://www.npmjs.com/package/@google-ai/generativelanguage
  • HuggingFace JS: https://github.com/huggingface/huggingface.js/
  • Mistral JS: https://github.com/mistralai/client-js
  • OpenAI JS: https://github.com/openai/openai-node
  • Langchain JS: https://github.com/langchain-ai/langgraphjs

AbhiPrasad • Jul 18 '24 17:07

This is currently difficult because we need to annotate the LLM spans with the pipeline span's name, and there is no nice way to pass that context down in a runtime-agnostic way.

In Python this is done with ContextVar.

Maybe we can convert the pipeline span into a transaction and pull its name down into the children that way?

colin-sentry • Jul 22 '24 20:07

> In Python this is done with ContextVar.

In Node.js we can use AsyncLocalStorage, which is what the SDK uses to maintain parent-child relationships between spans even across async boundaries.

In fact, the Sentry scope lives on top of AsyncLocalStorage in the Node SDK, so you can simply store data on the scope and read it back from nested calls.
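
A minimal sketch of that idea, using AsyncLocalStorage directly to carry the pipeline name down to nested LLM calls. The `runPipeline` and `callModel` helpers are hypothetical (not part of the SDK), and the `ai.*` op and attribute names are illustrative; `Sentry.startSpan` and `AsyncLocalStorage` are real APIs:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';
import * as Sentry from '@sentry/node';

// Context that should be visible to every LLM span created inside a pipeline.
const pipelineContext = new AsyncLocalStorage<{ pipelineName: string }>();

// Hypothetical pipeline wrapper: everything executed inside the callback,
// including nested async calls, can read the pipeline name without it being
// threaded through every function signature.
async function runPipeline<T>(pipelineName: string, callback: () => Promise<T>): Promise<T> {
  return pipelineContext.run({ pipelineName }, () =>
    Sentry.startSpan({ name: pipelineName, op: 'ai.pipeline' }, callback),
  );
}

// An LLM call deep inside the pipeline annotates its span with the pipeline
// name pulled from AsyncLocalStorage.
async function callModel(prompt: string): Promise<string> {
  const pipelineName = pipelineContext.getStore()?.pipelineName ?? 'unknown';
  return Sentry.startSpan(
    {
      name: 'chat-completion',
      op: 'ai.chat_completions.create',
      attributes: { 'ai.pipeline.name': pipelineName },
    },
    async () => {
      // ... call the actual provider SDK here
      return `response to: ${prompt}`;
    },
  );
}
```

Since the Sentry scope is itself backed by AsyncLocalStorage, the same pattern could instead store the pipeline name on the current scope rather than in a separate store.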

AbhiPrasad • Jul 22 '24 20:07

Where did this net out? Can I roll it by hand using lower-level primitives/patterns in the JavaScript SDK?

staticshock • Mar 12 '25 19:03

@staticshock we have automatic instrumentation for Vercel's ai library: https://docs.sentry.io/platforms/javascript/guides/node/configuration/integrations/vercelai/
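
Per that guide, enabling the integration looks roughly like this (the `vercelAIIntegration` export ships in recent `@sentry/node` releases; check the linked page for exact version requirements and any per-call telemetry options):

```typescript
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: '__YOUR_DSN__',
  tracesSampleRate: 1.0,
  // Instruments calls made through Vercel's `ai` package so LLM spans
  // show up in Sentry automatically.
  integrations: [Sentry.vercelAIIntegration()],
});
```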

For other libraries, you have to add manual instrumentation. We documented the span conventions you can use for that here: https://develop.sentry.dev/sdk/telemetry/traces/modules/llm-monitoring/
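
A hedged sketch of what manual instrumentation against those conventions might look like, wrapping an arbitrary client call. `instrumentedCompletion` and the `client` shape are hypothetical, and the exact op and attribute names should be taken from the linked conventions doc rather than from this example:

```typescript
import * as Sentry from '@sentry/node';

interface CompletionClient {
  complete: (prompt: string) => Promise<{ text: string; usage: { totalTokens: number } }>;
}

// Wraps a single LLM call in a span using ai.* conventions; verify the
// attribute keys against develop.sentry.dev before relying on them.
async function instrumentedCompletion(client: CompletionClient, prompt: string) {
  return Sentry.startSpan(
    {
      name: 'llm completion',
      op: 'ai.chat_completions.create',
      attributes: { 'ai.model_id': 'my-model' },
    },
    async (span) => {
      const result = await client.complete(prompt);
      // Record token usage so LLM monitoring can aggregate usage and cost.
      span.setAttribute('ai.total_tokens.used', result.usage.totalTokens);
      return result;
    },
  );
}
```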

AbhiPrasad • Mar 17 '25 14:03