dd-trace-py
feat(llmobs): capture more model parameters in openai chat/completion spans
This PR captures more model/input parameters in OpenAI chat/completion spans. Currently we only capture `temperature` and `max_tokens`, but there are many other request parameters that can be captured as part of the request metadata sent to LLM Observability; a rough sketch of the idea follows.
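The snippet below is a minimal, illustrative sketch of pulling additional OpenAI request parameters out of the call kwargs into span metadata. The helper name, the metadata key list, and the example values are hypothetical and do not reflect the actual dd-trace-py integration code; the parameter names mirror the public OpenAI chat/completion API.

```python
# Hypothetical sketch: collect extra OpenAI request parameters as span metadata.
# Only `temperature` and `max_tokens` are captured today; the other keys are
# examples of parameters this PR aims to include. Names below are illustrative.
from typing import Any, Dict

_METADATA_PARAMS = (
    "temperature",
    "max_tokens",
    "top_p",
    "n",
    "stop",
    "presence_penalty",
    "frequency_penalty",
    "logit_bias",
    "user",
)

def extract_request_metadata(kwargs: Dict[str, Any]) -> Dict[str, Any]:
    """Return only the model parameters actually present in the request kwargs."""
    return {k: kwargs[k] for k in _METADATA_PARAMS if k in kwargs}

# Example: metadata that would be attached to a chat completion span.
request_kwargs = {
    "model": "gpt-4",
    "temperature": 0.2,
    "max_tokens": 256,
    "top_p": 0.9,
    "presence_penalty": 0.5,
}
print(extract_request_metadata(request_kwargs))
# {'temperature': 0.2, 'max_tokens': 256, 'top_p': 0.9, 'presence_penalty': 0.5}
```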
Checklist
- [ ] Change(s) are motivated and described in the PR description
- [ ] Testing strategy is described if automated tests are not included in the PR
- [ ] Risks are described (performance impact, potential for breakage, maintainability)
- [ ] Change is maintainable (easy to change, telemetry, documentation)
- [ ] Library release note guidelines are followed or the `changelog/no-changelog` label is set
- [ ] Documentation is included (in-code, generated user docs, public corp docs)
- [ ] Backport labels are set (if applicable)
- [ ] If this PR changes the public interface, I've notified @DataDog/apm-tees.
Reviewer Checklist
- [ ] Title is accurate
- [ ] All changes are related to the pull request's stated goal
- [ ] Description motivates each change
- [ ] Avoids breaking API changes
- [ ] Testing strategy adequately addresses listed risks
- [ ] Change is maintainable (easy to change, telemetry, documentation)
- [ ] Release note makes sense to a user of the library
- [ ] Author has acknowledged and discussed the performance implications of this PR as reported in the benchmarks PR comment
- [ ] Backport labels are set in a manner that is consistent with the release branch maintenance policy