feat: add run_config concurrency controls for experiments
Issue Link / Problem Description
- Fixes https://github.com/vibrantlabsai/ragas/issues/2457 (experiment concurrency throttling)
Users running `@experiment().arun()` couldn't limit the number of concurrent async tasks to honor provider rate limits (e.g., Azure OpenAI). Unlike `evaluate()`, there was no `RunConfig`/`max_workers` option, so experiment tasks always ran at full concurrency.
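For context, a minimal usage sketch of the behavior this PR enables is below. Import paths, the dataset shape, and the exact `arun()` keyword handling are assumptions drawn from this description, not a definitive API:

```python
# Usage sketch only: import paths, the dataset shape, and the exact arun()
# keywords are assumptions based on this PR description.
import asyncio

from ragas import experiment            # assumed import path
from ragas.run_config import RunConfig


@experiment()
async def score_row(row: dict) -> dict:
    # Stand-in for a rate-limited provider call (e.g., Azure OpenAI).
    await asyncio.sleep(0.1)
    return {"query": row["query"], "score": 1.0}


async def main() -> None:
    dataset = [{"query": f"q{i}"} for i in range(20)]
    # New in this PR: cap in-flight tasks via RunConfig (or an explicit max_workers).
    await score_row.arun(dataset, run_config=RunConfig(max_workers=4))


if __name__ == "__main__":
    asyncio.run(main())
```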
Changes Made
- Thread an optional `run_config` + `max_workers` through `ExperimentWrapper.arun()` and the `@experiment` decorator, reusing `ragas.async_utils.as_completed` with the resolved worker limit (see the sketch after this list).
- Add unit tests covering `RunConfig`-based throttling, explicit overrides, and zero/unlimited coercion.
- Document the new knobs in `docs/concepts/experimentation.md` and the `RunConfig` how-to.
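To make the worker-limit resolution and throttling concrete, here is a minimal illustrative sketch. The actual change reuses `ragas.async_utils.as_completed`; this sketch uses a plain `asyncio.Semaphore` instead, and `_resolve_max_workers` / `_gather_limited` are hypothetical helper names used only to show the precedence (explicit `max_workers` over `run_config.max_workers`, with non-positive values treated as unlimited):

```python
# Illustrative sketch only: the real change reuses ragas.async_utils.as_completed;
# _resolve_max_workers and _gather_limited are hypothetical helpers.
import asyncio
from typing import Awaitable, List, Optional

from ragas.run_config import RunConfig


def _resolve_max_workers(
    run_config: Optional[RunConfig], max_workers: Optional[int]
) -> int:
    """Explicit max_workers wins; otherwise fall back to run_config; -1 means unlimited."""
    if max_workers is not None:
        return max_workers
    if run_config is not None:
        return run_config.max_workers
    return -1


async def _gather_limited(coros: List[Awaitable], limit: int) -> list:
    """Run coroutines with at most `limit` of them in flight at once."""
    if limit <= 0:  # zero/unlimited coerced to "no throttling"
        return await asyncio.gather(*coros)

    semaphore = asyncio.Semaphore(limit)

    async def bounded(coro: Awaitable):
        async with semaphore:
            return await coro

    return await asyncio.gather(*(bounded(c) for c in coros))
```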
Testing
How to Test
- [x] Automated tests added/updated: `uv run pytest tests/unit/test_experiment.py`
- [x] Manual testing steps:
  - `uv run async.py` (manual script) showed `run_config` keyword errors before the fix; after the fix it reports the expected max-concurrent values (unlimited, `run_config=1`, override=3).
  - `uv run pytest tests/unit/test_experiment.py -k run_config_max_workers` fails on the previous commit and passes now.
  - `make test` (full suite) – all tests pass.
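For illustration, a hedged sketch of the kind of assertion such a throttling test can make is below. It is not the contents of `tests/unit/test_experiment.py`; the `@experiment` import path, `arun()` keywords, and dataset shape are assumptions, and it assumes `pytest-asyncio` is installed:

```python
# Hypothetical test sketch; not the actual tests/unit/test_experiment.py.
# Import paths, arun() keywords, and the dataset shape are assumptions.
import asyncio

import pytest

from ragas import experiment
from ragas.run_config import RunConfig


@pytest.mark.asyncio
async def test_run_config_max_workers_caps_concurrency():
    active = 0
    peak = 0

    @experiment()
    async def probe(row: dict) -> dict:
        nonlocal active, peak
        active += 1
        peak = max(peak, active)
        await asyncio.sleep(0.01)
        active -= 1
        return row

    dataset = [{"idx": i} for i in range(10)]
    await probe.arun(dataset, run_config=RunConfig(max_workers=3))
    assert peak <= 3
```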
References
- Related issues: https://github.com/vibrantlabsai/ragas/issues/2457
- Documentation: `docs/concepts/experimentation.md`, `docs/howtos/customizations/_run_config.md`
- External references: none
Screenshots/Examples (if applicable)
N/A – behavior verified via tests + manual script.
Thanks @dhyaneesh, but we're deprecating `RunConfig`. We'll need to think from a fresh perspective about how to handle this config in the new architecture.
Will keep this PR open for now.
I'm happy to refactor this to match the new architecture, or to contribute to a new one, once I understand the preferred approach and any requirements.
Thanks for keeping the PR open, I’ll wait for guidance on the next steps.