Track performance of APIs

mitelg opened this issue 9 months ago • 3 comments

User story

Before we start improving our APIs, we should implement metrics and measurements.

API performance should be improved drastically, with the improvement proven by benchmarks. We need metrics for endpoint response times so we can see whether changes make the APIs slower or faster: track the time from the request coming in to the response going out.

Please also research whether we already have, or want to adopt, tooling to measure API performance, or whether we need to build something specifically for our needs.

Acceptance criteria

  • it should be possible to cover different API endpoints (Store API / Admin API) without a lot of effort
  • invocation times are available for reviewing (API monitoring)
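
To make the acceptance criteria concrete, a minimal sketch of a reusable timing helper (not existing Shopware tooling; wrapping each API call in a named callable is an assumption for illustration):

```python
import time
from statistics import mean
from typing import Callable, Dict, List


def time_invocation(call: Callable[[], object]) -> float:
    """Return the wall-clock duration of one invocation in milliseconds."""
    start = time.perf_counter()
    call()
    return (time.perf_counter() - start) * 1000.0


def benchmark_endpoints(calls: Dict[str, Callable[[], object]],
                        runs: int = 5) -> Dict[str, float]:
    """Time each named endpoint call `runs` times; return mean duration per name."""
    results: Dict[str, float] = {}
    for name, call in calls.items():
        samples: List[float] = [time_invocation(call) for _ in range(runs)]
        results[name] = mean(samples)
    return results
```

In practice each callable would wrap an HTTP request against a Store API or Admin API endpoint, so covering a new endpoint means adding one entry to the dictionary.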

mitelg · Mar 25 '25 08:03

Why the Missing Infrastructure Is Blocking Us

Currently, we are unable to proceed with tracking and comparing API requests due to missing infrastructure. Specifically, we do not have publicly accessible reference environments running the current Shopware LTS and Nightly versions in the cloud. The privately hosted servers available to us either lack Datadog integration or cannot be used for this purpose.

This absence of standardized, cloud-based environments prevents us from:

  • Running synthetic monitoring tests in a consistent and automated manner.
  • Comparing API metrics between different Shopware versions (e.g., LTS vs. Nightly).
  • Generating and validating OpenAPI schemas across multiple Shopware versions.
  • Ensuring all tests run against environments with an identical demo dataset, which is necessary for reproducible and meaningful results.
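
For the version-comparison point specifically, the arithmetic is simple once the environments exist; a hedged sketch, where the sample numbers are purely illustrative:

```python
from statistics import median


def relative_change(lts_samples_ms: list, nightly_samples_ms: list) -> float:
    """Relative change of the Nightly median latency versus the LTS median.

    Positive values mean Nightly is slower than LTS.
    """
    lts = median(lts_samples_ms)
    nightly = median(nightly_samples_ms)
    return (nightly - lts) / lts


# Example: a +10% median regression of Nightly over LTS.
change = relative_change([100.0, 102.0, 98.0], [110.0, 112.0, 108.0])
```

A monitoring setup could alert when this value crosses an agreed threshold per endpoint.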

Without these reference environments, we cannot establish a reliable baseline for performance and functional monitoring, nor can we validate improvements or regressions in the API across releases. To move forward, we need support in setting up and maintaining these cloud-based reference shops, including Datadog integration and demo data provisioning.

BrocksiNet · May 16 '25 06:05

Outcome of the June 5th meeting on this topic: @BrocksiNet will draft requirements for the SaaS team based on the brainstorm we had during the meeting and will share them with the SaaS team in preparation for the follow-up meeting, which @BrocksiNet will schedule. @nfortier-shopware will research how to get real-world API data from Datadog.

gianfranco-l · Jun 05 '25 13:06

This is a case that could/should have been caught had we had monitoring in place: https://shopware-ag.slack.com/archives/C080HM3C85R/p1750149247340489

keulinho · Jun 17 '25 12:06

Feedback from SaaS Team

Hey, I am not sure SaaS is the right fit for this requirement. We usually update shops to the newest releases as fast as possible, and staging has limited resources to keep it cost-efficient, which also means performance can vary quite a bit. Depending on your use case, a nightly GitHub Action could be worth a try to get reproducible results.
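
Building on that suggestion, a nightly job only needs a small, dependency-free check it can fail the build on. A sketch of the statistics side (the HTTP measurement itself is omitted, and the latency budget is an illustrative number, not an agreed target):

```python
import math


def p95(samples_ms: list) -> float:
    """95th-percentile latency via the nearest-rank method."""
    ordered = sorted(samples_ms)
    rank = max(0, math.ceil(0.95 * len(ordered)) - 1)
    return ordered[rank]


def within_budget(samples_ms: list, budget_ms: float) -> bool:
    """A nightly CI step could call this and exit non-zero when it returns
    False, turning a latency regression into a red build."""
    return p95(samples_ms) <= budget_ms
```

Running this on a schedule against a fixed demo dataset would give the reproducible result mentioned above.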

BrocksiNet · Jun 24 '25 13:06

It became evident that we need a reliable baseline plus on-demand continuous load testing and related monitoring; this is impacting other initiatives as well. We broke it down to define the plan and responsibilities here

gianfranco-l · Jul 10 '25 09:07

As the performance-testing initiative is being addressed at a higher level, this task will focus for now on gathering production data through a dashboard, which should help establish a further baseline.
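
For pulling production latency data into such a dashboard, Datadog's v1 timeseries query endpoint (`GET /api/v1/query`, authenticated via `DD-API-KEY` and `DD-APPLICATION-KEY` headers) takes `from`, `to`, and `query` parameters. A sketch of building those parameters; the metric name is a hypothetical placeholder, not a metric known to exist in this setup:

```python
import time


def build_datadog_query(metric: str, window_s: int = 3600) -> dict:
    """Build parameters for Datadog's v1 timeseries query endpoint,
    averaging the metric over all tags for the last `window_s` seconds."""
    now = int(time.time())
    return {"from": now - window_s, "to": now, "query": f"avg:{metric}{{*}}"}


# The metric name below is an assumption for illustration only.
params = build_datadog_query("shopware.api.request.duration", window_s=900)
# These params would then be sent with the auth headers, e.g.:
#   requests.get("https://api.datadoghq.com/api/v1/query",
#                params=params, headers={...})  # keys omitted
```
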

nfortier-shopware · Jul 15 '25 13:07