chore(deps-dev): bump the pip group across 1 directory with 26 updates
Updates the requirements on ipykernel, protobuf, sentence-transformers, graphrag-sdk, llama-index, llama-index-core, llama-index-llms-openai, fastapi, litellm, pandas, pytest, pytest-cov, pytest-asyncio, dirty-equals, mkdocs-material, mkdocstrings[python], mkdocs-git-revision-date-localized-plugin, typer, mkdocs-macros-plugin, mkdocs-glightbox, pyyaml, termcolor, mypy, ruff, pre-commit and uv to permit the latest version.
Updates ipykernel from 6.30.1 to 7.1.0
Release notes
Sourced from ipykernel's releases.
v7.1.0
7.1.0
IPykernel 7.1.0 fixes an issue where display outputs such as Matplotlib plots were not included when using the %notebook magic to save sessions as .ipynb files (#1435). This is enabled using the traitlet ZMQDisplayPublisher.store_display_history, which defaults to False, the previous behaviour. This is a minor release rather than a patch release due to the addition of the new traitlet.
Output from threads is restored to the pre-6.29 behaviour by default: it is routed to the latest cell unless get_ipython().set_parent() is called explicitly from the thread, in which case output from that thread will continue to be routed to the same cell. This behaviour is now opt-in, instead of unconditional (#1451).
This release also fixes bugs that were introduced into the 7.x branch relating to Matplotlib plots in separate windows not being displayed correctly (#1458), kernels launched in new threads failing asserts (#1455), and ContextVars persisting between cells (#1462). There is also a fix for keyboard interrupts on Windows (#1434).
Enhancements made
- Store display outputs in history for the %notebook magic #1435 (@Darshan808)
Bugs fixed
- Fix ContextVar persistence across cells #1462 (@minrk)
- Fix matplotlib eventloops #1458 (@ianthomas23)
- Refer to kernel launching thread instead of assuming the main thread #1455 (@dfalbel)
- Fix routing of background thread output when no parent is set explicitly #1451 (@minrk)
- Fix KeyboardInterrupt on Windows by manually resetting interrupt event #1434 (@ptosco)
Maintenance and upkeep improvements
- Update pre-commit and related #1465 (@Carreau)
- Test that matplotlib event loop integration is responsive #1463 (@minrk)
- Update tests for 3.14 #1453 (@minrk)
Contributors to this release
(GitHub contributors page for this release)
@Carreau | @Darshan808 | @dfalbel | @ianthomas23 | @krassowski | @lumberbot-app | @minrk | @ptosco
v7.0.1
7.0.1
IPykernel 7.0.1 is a bug fix release to support CPython 3.14.
Bugs fixed
- Avoid overriding Thread._context in Python 3.14 #1447 (@ianthomas23)
... (truncated)
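The new store_display_history traitlet can be turned on through the standard traitlets configuration mechanism; a minimal sketch, assuming ipykernel >= 7.1.0 (the Config object below mirrors what would go in ipython_kernel_config.py):

```python
# Hedged sketch: opting in to ZMQDisplayPublisher.store_display_history so the
# %notebook magic also saves display outputs (e.g. Matplotlib figures) to .ipynb.
# Assumes ipykernel >= 7.1.0; the traitlet name comes from the 7.1.0 release notes.
from traitlets.config import Config

c = Config()
c.ZMQDisplayPublisher.store_display_history = True  # default is False

# Equivalent settings: put the assignment in ipython_kernel_config.py, or pass
# --ZMQDisplayPublisher.store_display_history=True on the kernel command line.
```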
Changelog
Sourced from ipykernel's changelog.
7.0.1
IPykernel 7.0.1 is a bug fix release to support CPython 3.14.
Bugs fixed
- Avoid overriding Thread._context in Python 3.14 #1447 (@ianthomas23)
Maintenance and upkeep improvements
- Fix 7.x license warnings #1448 (@bollwyvl)
- ci: Test on PyPy 3.11 instead of 3.10 #1444 (@cclauss)
... (truncated)
Commits
- 39eaf96 Publish 7.1.0
- 6f61a68 test that matplotlib event loop integration is responsive (#1463)
- 8446e02 Fix KeyboardInterrupt on Windows by manually resetting interrupt event (#1434)
- dd1e094 update pre-commit and related (#1465)
- 95f2451 fix ContextVar persistence across cells (#1462)
- c56a7aa Fix matplotlib eventloops (#1458)
- c7af34c Refer to kernel launching thread instead of assuming the main thread (#1455)
- 7193d14 Fix routing of background thread output when no parent is set explicitly (#1451)
- b8f5dfc Store display outputs in history for %notebook magic (#1435)
- 93f11db update tests for 3.14 (#1453)
- Additional commits viewable in compare view
Updates protobuf from 6.32.0 to 6.33.1
Commits
- See full diff in compare view
Updates sentence-transformers to 5.1.2
Release notes
Sourced from sentence-transformers's releases.
v5.1.2 - Sentence Transformers joins Hugging Face; model saving/loading improvements and loss compatibility
This patch celebrates the transition of Sentence Transformers to Hugging Face, and improves model saving, loading defaults, and loss compatibilities.
Install this version with:
# Training + Inference
pip install sentence-transformers[train]==5.1.2
For inference only, use one of:
pip install sentence-transformers==5.1.2
pip install sentence-transformers[onnx-gpu]==5.1.2
pip install sentence-transformers[onnx]==5.1.2
pip install sentence-transformers[openvino]==5.1.2
Sentence Transformers is joining Hugging Face!
Today, Sentence Transformers is moving from the Ubiquitous Knowledge Processing (UKP) Lab at Technische Universität Darmstadt to Hugging Face. This formalizes the existing maintenance structure, as Tom Aarsen (that's me!) from Hugging Face has been maintaining the project for the past two years. The project's development roadmap, license, support, and commitment to the community remain unchanged. Read the full announcement for more details!
Minor changes re. saving and loading
- Improve saving models with StaticEmbedding (#3524) and Dense (#3528) modules.
- Fix training with CPU when "stronger" devices (CUDA, MPS) are available (#3525)
- Default to 'xpu' device over 'cpu' if the former is available (#3537)
Minor changes re. losses
- Change errors/warnings for MatryoshkaLoss to prevent easy-to-make mistakes, e.g. forgetting to use the original dimension (#3530)
- Introduce compatibility between MSELoss and MatryoshkaLoss (#3538)
- Also use mini-batches for positives with MegaBatchMarginLoss (#3550)
All Changes
- fix: static model saving fails because weights are not contiguous by @stephantul in UKPLab/sentence-transformers#3524
- bug: fix state bug in trainer by @stephantul in UKPLab/sentence-transformers#3525
- [fix] Patch Router training with Cached losses by @tomaarsen in UKPLab/sentence-transformers#3527
- [fix] Allow loading Dense modules not saved in fp32 by @tomaarsen in UKPLab/sentence-transformers#3528
- [tests] Patch Regex expected output for Python 3.9 by @tomaarsen in UKPLab/sentence-transformers#3529
- Future-proof tests for upcoming transformers version by @tomaarsen in UKPLab/sentence-transformers#3533
- feat: matryoshka dims bounds checks and warning by @stephantul in UKPLab/sentence-transformers#3530
- Performance table format and few markdownlint error corrections by @kaushikacharya in UKPLab/sentence-transformers#3534
- fix typos: inaccuracte -> inaccurate by @whybe-choi in UKPLab/sentence-transformers#3542
- [fix] correct dataset link in training scripts by @thomasht86 in UKPLab/sentence-transformers#3543
- [fix] Allow MatryoshkaLoss with MSELoss by @tomaarsen in UKPLab/sentence-transformers#3538
- [docs] Format all markdown using mdformat by @tomaarsen in UKPLab/sentence-transformers#3539
- feat: provide support for python 3.13 by @lordoffreaks in UKPLab/sentence-transformers#3551
- Update MTEB docs to v2 by @KennethEnevoldsen in UKPLab/sentence-transformers#3548
- Add support for Intel 'xpu' device in environment.py by @domschl in UKPLab/sentence-transformers#3537
- docs: Update project metadata by @tomaarsen in huggingface/sentence-transformers#3553
- MegaBatchMarginLoss use mini batches for positives too by @benHeid in huggingface/sentence-transformers#3550
... (truncated)
Commits
- dfcbbd9 Release v5.1.2
- 9cfa9ad Merge branch 'master' into v5.1-release
- 85ec645 MegaBatchMarginLoss use mini batches for positives too (#3550)
- 8e4c85b docs: Update project metadata (#3553)
- 2220062 Add support for Intel 'xpu' device in environment.py (#3537)
- 315a67d Update MTEB docs to v2 (#3548)
- 471387e chore: add python 3.13 to CI and classifiers (#3551)
- 77ae248 docs: Format all markdown using mdformat (#3539)
- b84cc32 [fix] Allow MatryoshkaLoss with MSELoss (#3538)
- 051ce1a Correct link in example training scripts (#3543)
- Additional commits viewable in compare view
Updates graphrag-sdk from 0.8.0 to 0.8.1
Release notes
Sourced from graphrag-sdk's releases.
v0.8.1
What's Changed
- Fix poetry and pip file by @Naseem77 in FalkorDB/GraphRAG-SDK#122
- Add refresh_ontology() method to kg by @Naseem77 in FalkorDB/GraphRAG-SDK#126
- fix: upgrade pypdf version by @priyansh4320 in FalkorDB/GraphRAG-SDK#120
- update README by @Naseem77 in FalkorDB/GraphRAG-SDK#125
- avoid deepcopy exception with OllamaGenerativeModel by @gsw945 in FalkorDB/GraphRAG-SDK#128
- Update Ollama Usage - Arguments & Readme by @galshubeli in FalkorDB/GraphRAG-SDK#129
- fix ValueError in AttributeType.from_string() by @gsw945 in FalkorDB/GraphRAG-SDK#130
- Fixed typo in prompts by @Tesla2000 in FalkorDB/GraphRAG-SDK#131
New Contributors
- @Naseem77 made their first contribution in FalkorDB/GraphRAG-SDK#122
- @priyansh4320 made their first contribution in FalkorDB/GraphRAG-SDK#120
- @gsw945 made their first contribution in FalkorDB/GraphRAG-SDK#128
- @Tesla2000 made their first contribution in FalkorDB/GraphRAG-SDK#131
Full Changelog: https://github.com/FalkorDB/GraphRAG-SDK/compare/v0.8.0...v0.8.1
Commits
- bee24ab Merge pull request #131 from Tesla2000/main
- 9421b84 Fixed typo in prompts
- bb0c54a Merge pull request #130 from gsw945/gsw945-patch-2
- c72afa9 Remove the 'AttributeType.' prefix to avoid 'Invalid attribute type' error.
- 9c30c23 Merge pull request #129 from FalkorDB/olllama-usage
- 8600b06 update-lock-file
- 20cc910 Merge branch 'main' into olllama-usage
- 7b03304 update-ollama-usage
- 13fc340 Merge pull request #128 from gsw945/patch-1
- 06ff2cd Merge branch 'main' into patch-1
- Additional commits viewable in compare view
Updates llama-index to 0.14.8
Release notes
Sourced from llama-index's releases.
v0.14.8
Release Notes
[2025-11-10]
llama-index-core [0.14.8]
- Fix ReActOutputParser getting stuck when "Answer:" contains "Action:" (#20098)
- Add buffer to image, audio, video and document blocks (#20153)
- fix(agent): Handle multi-block ChatMessage in ReActAgent (#20196)
- Fix/20209 (#20214)
- Preserve Exception in ToolOutput (#20231)
- fix weird pydantic warning (#20235)
llama-index-embeddings-nvidia [0.4.2]
- docs: Edit pass and update example model (#20198)
llama-index-embeddings-ollama [0.8.4]
- Added a test case (no code) to check the embedding through an actual connection to an Ollama server (after checking that the Ollama server exists) (#20230)
llama-index-llms-anthropic [0.10.2]
- feat(llms/anthropic): Add support for RawMessageDeltaEvent in streaming (#20206)
- chore: remove unsupported models (#20211)
llama-index-llms-bedrock-converse [0.11.1]
- feat: integrate bedrock converse with tool call block (#20099)
- feat: Update model name extraction to include 'jp' region prefix and … (#20233)
llama-index-llms-google-genai [0.7.3]
- feat: google genai integration with tool block (#20096)
- fix: non-streaming gemini tool calling (#20207)
- Add token usage information in GoogleGenAI chat additional_kwargs (#20219)
- bug fix google genai stream_complete (#20220)
llama-index-llms-nvidia [0.4.4]
- docs: Edit pass and code example updates (#20200)
llama-index-llms-openai [0.6.8]
- FixV2: Correct DocumentBlock type for OpenAI from 'input_file' to 'file' (#20203)
- OpenAI v2 sdk support (#20234)
llama-index-llms-upstage [0.6.5]
... (truncated)
Changelog
Sourced from llama-index's changelog.
llama-index-llms-upstage [0.6.5]
- OpenAI v2 sdk support (#20234)
llama-index-packs-streamlit-chatbot [0.5.2]
- OpenAI v2 sdk support (#20234)
... (truncated)
Commits
- bc52c85 Release 0.14.8 (#20236)
- 1a960be Preserve Exception in ToolOutput (#20231)
- 60d102d fix weird pydantic warning (#20235)
- 575a14c Added a test case (no code) to check the embedding through an actual connecti...
- 8d4680f Update Scrapy dependency to 2.13.3 (#20228)
- 1c8566b Update llama-index-core dependency to 0.12.45 (#20227)
- aaf94f8 fix: Ensure schema creation only occurs if it doesn't already exist (#20225)
- 67b198f feat: Update model name extraction to include 'jp' region prefix and … (#20233)
- 74c7204 OpenAI v2 sdk support (#20234)
- fdc676a feat(llms/anthropic): Add support for RawMessageDeltaEvent in streaming (#20206)
- Additional commits viewable in compare view
Updates llama-index-core to 0.14.8
The release notes, changelog, and commits for llama-index-core 0.14.8 are identical to the llama-index 0.14.8 entry above.
Updates llama-index-llms-openai to 0.6.9
Updates fastapi from 0.116.1 to 0.121.2
Release notes
Sourced from fastapi's releases.
0.121.2
Fixes
- 🐛 Fix handling of JSON Schema attributes named "$ref". PR #14349 by @tiangolo.
Docs
- 📝 Add EuroPython talk & podcast episode with Sebastián Ramírez. PR #14260 by @clytaemnestra.
- ✏️ Fix links and add missing permalink in docs. PR #14217 by @YuriiMotov.
Translations
- 🌐 Update Portuguese translations with LLM prompt. PR #14228 by @ceb10n.
- 🔨 Add Portuguese translations LLM prompt. PR #14208 by @ceb10n.
- 🌐 Sync Russian docs. PR #14331 by @YuriiMotov.
- 🌐 Sync German docs. PR #14317 by @nilslindemann.
0.121.1
Fixes
- 🐛 Fix Depends(func, scope='function') for top level (parameterless) dependencies. PR #14301 by @luzzodev.
Docs
- 📝 Update docs for advanced dependencies with yield, noting the changes in 0.121.0, adding scope. PR #14287 by @tiangolo.
Internal
- ⬆ Bump ruff from 0.13.2 to 0.14.3. PR #14276 by @dependabot[bot].
- ⬆ [pre-commit.ci] pre-commit autoupdate. PR #14289 by @pre-commit-ci[bot].
0.121.0
Features
- ✨ Add support for dependencies with scopes; support scope="request" for dependencies with yield that exit before the response is sent. PR #14262 by @tiangolo.
Internal
- 👥 Update FastAPI People - Contributors and Translators. PR #14273 by @tiangolo.
- 👥 Update FastAPI People - Sponsors. PR #14274 by @tiangolo.
- 👥 Update FastAPI GitHub topic repositories. PR #14280 by @tiangolo.
- ⬆ Bump mkdocs-macros-plugin from 1.4.0 to 1.4.1. PR #14277 by @dependabot[bot].
- ⬆ Bump mkdocstrings[python] from 0.26.1 to 0.30.1. PR #14279 by @dependabot[bot].
0.120.4
Fixes
- 🐛 Fix security schemes in OpenAPI when added at the top level app. PR #14266 by @YuriiMotov.
... (truncated)
Commits
- 02e108d 🔖 Release version 0.121.2
- d3b7597 📝 Update release notes
- 5d40dfb 🐛 Fix handling of JSON Schema attributes named "$ref" (#14349)
- eaf611f 📝 Update release notes
- 004ab1a 📝 Add EuroPython talk & podcast episode with Sebastián Ramírez (#14260)
- d1be85c 📝 Update release notes
- 42930fe ✏️ Fix links and add missing permalink in docs (#14217)
- 9e362d9 📝 Update release notes
- 540a83d 🌐 Update Portuguese translations with LLM prompt (#14228)
- 1a2e415 📝 Update release notes
- Additional commits viewable in compare view
Updates litellm to 1.79.3
Commits
- See full diff in compare view
Updates pandas from 2.3.2 to 2.3.3
Release notes
Sourced from pandas's releases.
Pandas 2.3.3
We are pleased to announce the release of pandas 2.3.3. This release includes some improvements and fixes to the future string data type (preview feature for the upcoming pandas 3.0). We recommend that all users upgrade to this version.
See the full whatsnew for a list of all the changes. Pandas 2.3.3 supports Python 3.9 and higher, and is the first release to support Python 3.14.
The release will be available on the conda-forge channel:
conda install pandas --channel conda-forge
Or via PyPI:
python3 -m pip install --upgrade pandas
Please report any issues with the release on the pandas issue tracker.
Thanks to all the contributors who made this release possible.
Commits
- 9c8bc3e RLS: 2.3.3
- 6aa788a [backport 2.3.x] DOC: prepare 2.3.3 whatsnew notes for release (#62499) (#62508)
- b64f0df [backport 2.3.x] BUG: avoid validation error for ufunc with string[python] ar...
- 058eb2b [backport 2.3.x] BUG: String[pyarrow] comparison with mixed object (#62424) (...
- 2ca088d [backport 2.3.x] DEPR: remove the Period resampling deprecation (#62480) (#62...
- 92bf98f [backport 2.3.x] BUG: fix .str.isdigit to honor unicode superscript for older...
- e57c7d6 Backport PR #62452 on branch 2.3.x (TST: Adjust tests for numexpr 2.13) (#62454)
- e0fe9a0 Backport to 2.3.x: REGR: from_records not initializing subclasses properly (#...
- 23a1085 BUG: improve future warning for boolean operations with missaligned indexes (...
- 6113696 Backport PR #62396 on branch 2.3.x (PKG/DOC: indicate Python 3.14 support in ...
- Additional commits viewable in compare view
Updates pytest from 8.4.2 to 9.0.1
Release notes
Sourced from pytest's releases.
9.0.1
pytest 9.0.1 (2025-11-12)
Bug fixes
- #13895: Restore support for skipping tests via raise unittest.SkipTest.
- #13896: The terminal progress plugin added in pytest 9.0 is now automatically disabled when iTerm2 is detected; it generated desktop notifications instead of the desired functionality.
- #13904: Fixed the TOML type of the verbosity settings in the API reference from number to string.
- #13910: Fixed "UserWarning: Do not expect file_or_dir" on some earlier Python 3.12 and 3.13 point versions.
Packaging updates and notes for downstreams
- #13933: The tox configuration has been adjusted to make sure the desired version string can be passed into its package_env through the SETUPTOOLS_SCM_PRETEND_VERSION_FOR_PYTEST environment variable as part of the release process -- by webknjaz.
Contributor-facing changes
... (truncated)
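The restored skip pattern from #13895 is simply raising the exception from a test body; a minimal sketch, with the test name and message invented for illustration:

```python
# Minimal sketch of the pattern whose support pytest 9.0.1 restored (#13895):
# raising unittest.SkipTest directly instead of calling pytest.skip().
import unittest

def test_requires_optional_backend():
    raise unittest.SkipTest("optional backend not installed")
```

When pytest collects and runs this test, it is reported as skipped rather than failed, matching unittest's own handling of SkipTest.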
Dependabot tried to update this pull request, but something went wrong. We're looking into it, but in the meantime you can retry the update by commenting @dependabot recreate.
Looks like these dependencies are updatable in another way, so this is no longer needed.