qubes-core-admin
Remove preloaded disposables on test tearDown
If the test does not remove preloaded disposables itself, it is handled by `qubes/tests/__init__.py`, which attempts to kill the domain. If the preloaded disposable is still starting, the resulting exceptions are handled by also attempting to kill the domain. Both paths trigger `_bare_cleanup()`, sometimes indirectly via `cleanup()` or `_auto_cleanup()`. But if `_bare_cleanup()` runs from the call that did not issue the kill, it does not await the kill, and it attempts to remove the domain from disk while the domain is still running (not completely killed).
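The required ordering can be illustrated with a minimal asyncio sketch (hypothetical names, not the actual qubes code): the kill must be awaited in the same call chain that performs the on-disk cleanup, so cleanup can never race a still-running domain.

```python
import asyncio

class FakeDispVM:
    """Stand-in for a preloaded disposable (hypothetical, for illustration)."""

    def __init__(self, name):
        self.name = name
        self.running = True

    async def kill(self):
        # Killing a domain is asynchronous; the VM is only safe to
        # remove from disk after this coroutine completes.
        await asyncio.sleep(0)
        self.running = False

    def remove_from_disk(self):
        # Removing storage while the domain still runs is exactly the
        # race described above; guard against it explicitly here.
        if self.running:
            raise RuntimeError(f"{self.name} is still running")

async def tear_down(preloaded):
    # Await the kill *before* touching the disk, in the same call
    # chain, so no other cleanup path fires while the domain runs.
    for vm in preloaded:
        await vm.kill()
        vm.remove_from_disk()

asyncio.run(tear_down([FakeDispVM("disp1"), FakeDispVM("disp2")]))
```

If instead `remove_from_disk()` were invoked from a second, non-awaiting code path (as `_bare_cleanup()` can be), it would observe `running == True` and fail, which mirrors the bug this change avoids by cleaning up in the test's tearDown.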
For: https://github.com/QubesOS/qubes-core-admin/pull/742
For: https://github.com/QubesOS/qubes-issues/issues/1512
Fixes: https://github.com/qubesos/qubes-issues/issues/10369
Noticed an error. OpenQA is in progress but the dispvm job hasn't started yet. Do you think it can upload a new package and update in the meantime, so the same test run uses it and we don't have to wait for another one?
https://openqa.qubes-os.org/tests/158404
PipelineRetryFailed
Codecov Report
:x: Patch coverage is 22.22222% with 14 lines in your changes missing coverage. Please review.
:white_check_mark: Project coverage is 70.36%. Comparing base (335c908) to head (f1cbf21).
:warning: Report is 12 commits behind head on main.
| Files with missing lines | Patch % | Lines |
|---|---|---|
| qubes/vm/dispvm.py | 23.52% | 13 Missing :warning: |
| qubes/vm/mix/dvmtemplate.py | 0.00% | 1 Missing :warning: |
Additional details and impacted files
```
@@            Coverage Diff             @@
##             main     #748      +/-   ##
==========================================
- Coverage   70.43%   70.36%   -0.08%
==========================================
  Files          61       61
  Lines       13755    13751       -4
==========================================
- Hits         9689     9676      -13
- Misses       4066     4075       +9
```
| Flag | Coverage Δ | |
|---|---|---|
| unittests | 70.36% <22.22%> (-0.08%) | :arrow_down: |
Flags with carried forward coverage won't be shown. Click here to find out more.
:umbrella: View full report in Codecov by Sentry.
Nope, too late, will be in another run.
OpenQA test summary
Complete test suite and dependencies: https://openqa.qubes-os.org/tests/overview?distri=qubesos&version=4.3&build=2025110513-4.3&flavor=pull-requests
Test run included the following:
- https://github.com/QubesOS/qubes-app-linux-usb-proxy/pull/51 (https://github.com/QubesOS/qubes-app-linux-usb-proxy/pull/51/commits/c643e601b9864517a4c4c1531721501ab961157c)
- https://github.com/QubesOS/qubes-core-admin-client/pull/332 (https://github.com/QubesOS/qubes-core-admin-client/pull/332/commits/40487bcb59ff4a704248144f4783ce0b0e1532b8)
- https://github.com/QubesOS/qubes-core-admin-client/pull/390 (https://github.com/QubesOS/qubes-core-admin-client/pull/390/commits/7ba9e48da005dcebc4b195b9e4c0983320d5a2d5)
- https://github.com/QubesOS/qubes-core-admin-client/pull/393 (https://github.com/QubesOS/qubes-core-admin-client/pull/393/commits/1510d25a75b01dff82c50aa72f1bf7ebb83b1f07)
- https://github.com/QubesOS/qubes-core-admin-client/pull/394 (https://github.com/QubesOS/qubes-core-admin-client/pull/394/commits/537080349eba19688db17292d3860d9123e89759)
- https://github.com/QubesOS/qubes-core-admin/pull/733 (https://github.com/QubesOS/qubes-core-admin/pull/733/commits/67b8fcb6b7116e987c2272e93d2a9b513764b7e5)
- https://github.com/QubesOS/qubes-core-admin/pull/736 (https://github.com/QubesOS/qubes-core-admin/pull/736/commits/a3fc21c156442bb42fa5e13c994c3dfb51ba5e1c)
- https://github.com/QubesOS/qubes-core-admin/pull/747 (https://github.com/QubesOS/qubes-core-admin/pull/747/commits/c139c35c38603f2c7b79d914d546ff49a3b87677)
- https://github.com/QubesOS/qubes-core-admin/pull/748 (https://github.com/QubesOS/qubes-core-admin/pull/748/commits/f1cbf21598178959708045125f95e261bcb3772a)
- https://github.com/QubesOS/qubes-core-admin/pull/749 (https://github.com/QubesOS/qubes-core-admin/pull/749/commits/fd8dc06a59a72e5c05c82c6c42d77fd77f834cd0)
- https://github.com/QubesOS/qubes-core-agent-linux/pull/604 (https://github.com/QubesOS/qubes-core-agent-linux/pull/604/commits/659e8e620ac8d0191f65a68db2058e4c6ee2a09d)
- https://github.com/QubesOS/qubes-core-agent-linux/pull/620 (https://github.com/QubesOS/qubes-core-agent-linux/pull/620/commits/3d75a220a693eb8489d0e4c57c82628a158c2c20)
- https://github.com/QubesOS/qubes-desktop-linux-i3-settings-qubes/pull/21 (https://github.com/QubesOS/qubes-desktop-linux-i3-settings-qubes/pull/21/commits/c2bb4f196c50fbfdd57bfe2e4ae93319341c19f6)
- https://github.com/QubesOS/qubes-gui-agent-linux/pull/248 (https://github.com/QubesOS/qubes-gui-agent-linux/pull/248/commits/e50583a711e75139a07ead2eb0a06c4803fde26a)
- https://github.com/QubesOS/qubes-mgmt-salt-dom0-virtual-machines/pull/86 (https://github.com/QubesOS/qubes-mgmt-salt-dom0-virtual-machines/pull/86/commits/d5806deaef3de18f561de3437a2703ccba4712e8)
Upload failures
- system_tests_dispvm
  - system_tests: Fail (unknown)
    Tests qubes.tests.integ.dispvm failed (exit code 1), details report...
  - system_tests: Failed (test died)
    # Test died: command 'curl --form [email protected]...
- system_tests_basic_vm_qrexec_gui_ext4
  - system_tests: Failed (test died)
    # Test died: command 'curl --form [email protected] --form up...
  - system_tests: Failed (test died)
New failures, excluding unstable
Compared to: https://openqa.qubes-os.org/tests/overview?distri=qubesos&version=4.3&build=2025081011-4.3&flavor=update
- system_tests_whonix
- system_tests_extra
  - TC_00_QVCTest_whonix-gateway-18: test_010_screenshare (failure)
    AssertionError: 1 != 0 : Timeout waiting for /dev/video0 in test-in...
  - TC_00_QVCTest_whonix-workstation-18: test_010_screenshare (failure)
    AssertionError: 1 != 0 : Timeout waiting for /dev/video0 in test-in...
- system_tests_usbproxy
  - TC_30_USBProxy_core3: test_010_on_qdb_change_multiple_assignments_including_full (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_011_on_qdb_change_multiple_assignments_port_vs_dev (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_012_on_qdb_change_multiple_assignments_dev (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_020_on_startup_multiple_assignments_including_full (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_021_on_startup_multiple_assignments_port_vs_dev (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_022_on_startup_multiple_assignments_dev (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_023_on_startup_already_attached (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
- system_tests_whonix@hw7
- system_tests_qwt_win10_seamless@hw13
  - windows_install: Failed (test died)
    # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...
  - windows_install: Failed (test died)
- system_tests_qwt_win11@hw13
  - windows_install: wait_serial (wait serial expected)
    # wait_serial expected: qr/iDVvW-\d+-/...
  - windows_install: Failed (test died + timed out)
    # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...
- system_tests_dispvm_perf@hw7
  - TC_00_DispVMPerf_whonix-workstation-18: test_002_vm_dispvm_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_008_dom0_dispvm_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_009_dom0_dispvm_gui_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_402_dom0_dispvm_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_409_dom0_dispvm_preload_gui_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_900_reader (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf_reader.py --templ...
- system_tests_guivm_vnc_gui_interactive
  - clipboard_and_web: unnamed test (unknown)
  - clipboard_and_web: Failed (test died)
    # Test died: no candidate needle with tag(s) 'personal-firefox' mat...
Failed tests
24 failures
- system_tests_whonix
- system_tests_extra
  - TC_00_QVCTest_whonix-gateway-18: test_010_screenshare (failure)
    AssertionError: 1 != 0 : Timeout waiting for /dev/video0 in test-in...
  - TC_00_QVCTest_whonix-workstation-18: test_010_screenshare (failure)
    AssertionError: 1 != 0 : Timeout waiting for /dev/video0 in test-in...
- system_tests_usbproxy
  - TC_30_USBProxy_core3: test_010_on_qdb_change_multiple_assignments_including_full (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_011_on_qdb_change_multiple_assignments_port_vs_dev (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_012_on_qdb_change_multiple_assignments_dev (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_020_on_startup_multiple_assignments_including_full (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_021_on_startup_multiple_assignments_port_vs_dev (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_022_on_startup_multiple_assignments_dev (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
  - TC_30_USBProxy_core3: test_023_on_startup_already_attached (error)
    AttributeError: 'TestVM' object has no attribute 'uuid'
- system_tests_whonix@hw7
- system_tests_qwt_win10_seamless@hw13
  - windows_install: Failed (test died)
    # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...
  - windows_install: Failed (test died)
- system_tests_qwt_win11@hw13
  - windows_install: wait_serial (wait serial expected)
    # wait_serial expected: qr/iDVvW-\d+-/...
  - windows_install: Failed (test died + timed out)
    # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...
- system_tests_dispvm_perf@hw7
  - TC_00_DispVMPerf_whonix-workstation-18: test_002_vm_dispvm_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_008_dom0_dispvm_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_009_dom0_dispvm_gui_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_402_dom0_dispvm_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_409_dom0_dispvm_preload_gui_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-18: test_900_reader (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf_reader.py --templ...
- system_tests_guivm_vnc_gui_interactive
  - clipboard_and_web: unnamed test (unknown)
  - clipboard_and_web: Failed (test died)
    # Test died: no candidate needle with tag(s) 'personal-firefox' mat...
Fixed failures
Compared to: https://openqa.qubes-os.org/tests/149225#dependencies
81 fixed
- system_tests_extra
  - TC_00_QVCTest_whonix-workstation-17: test_010_screenshare (failure)
    AssertionError: 1 != 0 : Timeout waiting for /dev/video0 in test-in...
  - TC_00_QVCTest_whonix-workstation-17: test_010_screenshare (failure)
- system_tests_dispvm
  - TC_20_DispVM_debian-13-xfce: test_012_preload_low_mem (failure)
    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^... AssertionError: 1 != 0
  - TC_20_DispVM_debian-13-xfce: test_013_preload_gui (error)
    raise KeyError(key)... KeyError: 'disp3723'
  - TC_20_DispVM_debian-13-xfce: test_014_preload_nogui (error + cleanup)
    raise TimeoutError from exc_val... TimeoutError
  - TC_20_DispVM_debian-13-xfce: test_015_preload_race_more (error + cleanup)
    raise KeyError(key)... KeyError: 'disp1187'
  - TC_20_DispVM_debian-13-xfce: test_016_preload_race_less (failure + cleanup)
    ^^^^^^^^^^^^^^^^^^^^^^... AssertionError
  - TC_20_DispVM_debian-13-xfce: test_017_preload_autostart (error)
    raise KeyError(key)... KeyError: 'disp7317'
  - TC_20_DispVM_debian-13-xfce: test_018_preload_global (error)
    raise KeyError(key)... KeyError: 'disp8572'
  - TC_20_DispVM_debian-13-xfce: test_019_preload_refresh (error)
    raise KeyError(key)... KeyError: 'disp6425'
  - TC_20_DispVM_fedora-42-xfce: test_012_preload_low_mem (failure)
    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^... AssertionError: 1 != 0
  - TC_20_DispVM_whonix-workstation-17: test_012_preload_low_mem (failure)
    ~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^... AssertionError: 1 != 0
- system_tests_kde_gui_interactive
  - gui_keyboard_layout: wait_serial (wait serial expected)
    # wait_serial expected: "echo -e '[Layout]\nLayoutList=us,de' | sud...
  - gui_keyboard_layout: Failed (test died)
    # Test died: command 'test "$(cd ~user;ls e1*)" = "$(qvm-run -p wor...
- system_tests_audio
  - system_tests: Fail (unknown)
    Tests qubes.tests.integ.audio failed (exit code 1), details reporte...
  - system_tests: Failed (test died)
    # Test died: Some tests failed at qubesos/tests/system_tests.pm lin...
  - TC_20_AudioVM_Pulse_whonix-workstation-17: test_223_audio_play_hvm (error)
    qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...
  - TC_20_AudioVM_Pulse_whonix-workstation-17: test_224_audio_rec_muted_hvm (error)
    qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...
  - TC_20_AudioVM_Pulse_whonix-workstation-17: test_225_audio_rec_unmuted_hvm (error)
    qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...
  - TC_20_AudioVM_Pulse_whonix-workstation-17: test_252_audio_playback_audiovm_switch_hvm (error)
    qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seco...
- system_tests_audio@hw1
  - system_tests: Fail (unknown)
    Tests qubes.tests.integ.audio failed (exit code 1), details reporte...
  - system_tests: Failed (test died)
    # Test died: Some tests failed at qubesos/tests/system_tests.pm lin...
  - TC_20_AudioVM_Pulse_whonix-workstation-17: test_223_audio_play_hvm (error)
    qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...
  - TC_20_AudioVM_Pulse_whonix-workstation-17: test_224_audio_rec_muted_hvm (error)
    qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...
  - TC_20_AudioVM_Pulse_whonix-workstation-17: test_252_audio_playback_audiovm_switch_hvm (error)
    qubes.exc.QubesVMError: Cannot connect to qrexec agent for 60 secon...
- system_tests_dispvm_perf@hw7
  - system_tests: wait_serial (wait serial expected)
    # wait_serial expected: qr/D2qNh-\d+-/...
  - TC_00_DispVMPerf_debian-13-xfce: test_000_dispvm (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_002_dispvm_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_006_dispvm_from_dom0 (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_008_dispvm_from_dom0_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_020_dispvm_preload (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_021_dispvm_preload_gui (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_022_dispvm_preload_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_023_dispvm_preload_gui_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_026_dispvm_from_dom0_preload (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_027_dispvm_from_dom0_preload_gui (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_028_dispvm_from_dom0_preload_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_029_dispvm_from_dom0_preload_gui_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_400_dispvm_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_402_dispvm_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_404_dispvm_preload_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_404_dispvm_preload_less_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_404_dispvm_preload_more_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_405_dispvm_preload_gui_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_406_dispvm_preload_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_407_dispvm_preload_gui_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_900_vm (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_902_vm_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_904_vm_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_debian-13-xfce: test_906_vm_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_020_dispvm_preload (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_021_dispvm_preload_gui (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_022_dispvm_preload_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_023_dispvm_preload_gui_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_026_dispvm_from_dom0_preload (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_027_dispvm_from_dom0_preload_gui (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_028_dispvm_from_dom0_preload_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_029_dispvm_from_dom0_preload_gui_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_404_dispvm_preload_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_404_dispvm_preload_less_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_404_dispvm_preload_more_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_405_dispvm_preload_gui_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_406_dispvm_preload_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_fedora-42-xfce: test_407_dispvm_preload_gui_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_020_dispvm_preload (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_021_dispvm_preload_gui (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_022_dispvm_preload_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_023_dispvm_preload_gui_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_026_dispvm_from_dom0_preload (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_027_dispvm_from_dom0_preload_gui (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_028_dispvm_from_dom0_preload_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_029_dispvm_from_dom0_preload_gui_concurrent (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_404_dispvm_preload_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_404_dispvm_preload_less_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_404_dispvm_preload_more_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_405_dispvm_preload_gui_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_406_dispvm_preload_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
  - TC_00_DispVMPerf_whonix-workstation-17: test_407_dispvm_preload_gui_concurrent_api (failure)
    AssertionError: '/usr/lib/qubes/tests/dispvm_perf.py --dvm=test-ins...
- system_tests_guivm_gpu_gui_interactive@hw13
  - guivm_startup: wait_serial (wait serial expected)
    # wait_serial expected: qr/lEcbc-\d+-/...
  - guivm_startup: Failed (test died + timed out)
    # Test died: command '! qvm-check sys-whonix || time qvm-start sys-...
- system_tests_basic_vm_qrexec_gui_ext4
  - system_tests: Fail (unknown)
    Tests qubes.tests.integ.vm_qrexec_gui failed (exit code 1), details...
  - TC_20_NonAudio_whonix-gateway-17-pool: test_012_qubes_desktop_run (error + cleanup)
    raise TimeoutError from exc_val... TimeoutError
Unstable tests
Performance Tests
Performance degradation:
18 performance degradations
- dom0_root_seq1m_q8t1_write 3:write_bandwidth_kb: 86415.00 🔻 ( previous job: 265260.00, degradation: 32.58%)
- dom0_root_seq1m_q1t1_read 3:read_bandwidth_kb: 170427.00 🔻 ( previous job: 431512.00, degradation: 39.50%)
- dom0_root_seq1m_q1t1_write 3:write_bandwidth_kb: 100091.00 🔻 ( previous job: 196254.00, degradation: 51.00%)
- dom0_root_rnd4k_q32t1_read 3:read_bandwidth_kb: 13637.00 🔻 ( previous job: 23940.00, degradation: 56.96%)
- dom0_root_rnd4k_q32t1_write 3:write_bandwidth_kb: 2090.00 🔻 ( previous job: 2446.00, degradation: 85.45%)
- fedora-42-xfce_root_seq1m_q8t1_write 3:write_bandwidth_kb: 117773.00 🔻 ( previous job: 140215.00, degradation: 83.99%)
- fedora-42-xfce_root_seq1m_q1t1_read 3:read_bandwidth_kb: 224968.00 🔻 ( previous job: 299764.00, degradation: 75.05%)
- fedora-42-xfce_root_seq1m_q1t1_write 3:write_bandwidth_kb: 32680.00 🔻 ( previous job: 47575.00, degradation: 68.69%)
- fedora-42-xfce_root_rnd4k_q32t1_write 3:write_bandwidth_kb: 2496.00 🔻 ( previous job: 3020.00, degradation: 82.65%)
- fedora-42-xfce_root_rnd4k_q1t1_write 3:write_bandwidth_kb: 871.00 🔻 ( previous job: 1368.00, degradation: 63.67%)
- fedora-42-xfce_private_seq1m_q8t1_write 3:write_bandwidth_kb: 79735.00 🔻 ( previous job: 136640.00, degradation: 58.35%)
- fedora-42-xfce_private_seq1m_q1t1_read 3:read_bandwidth_kb: 223529.00 🔻 ( previous job: 325139.00, degradation: 68.75%)
- fedora-42-xfce_private_seq1m_q1t1_write 3:write_bandwidth_kb: 64089.00 🔻 ( previous job: 79539.00, degradation: 80.58%)
- fedora-42-xfce_private_rnd4k_q32t1_write 3:write_bandwidth_kb: 1946.00 🔻 ( previous job: 3765.00, degradation: 51.69%)
- fedora-42-xfce_private_rnd4k_q1t1_read 3:read_bandwidth_kb: 7799.00 🔻 ( previous job: 8992.00, degradation: 86.73%)
- fedora-42-xfce_private_rnd4k_q1t1_write 3:write_bandwidth_kb: 500.00 🔻 ( previous job: 1251.00, degradation: 39.97%)
- fedora-42-xfce_volatile_seq1m_q8t1_read 3:read_bandwidth_kb: 343232.00 🔻 ( previous job: 383531.00, degradation: 89.49%)
- fedora-42-xfce_volatile_seq1m_q1t1_write 3:write_bandwidth_kb: 45374.00 🔻 ( previous job: 64217.00, degradation: 70.66%)
Remaining performance tests:
157 tests
- debian-13-xfce_exec: 8.00 🟢 ( previous job: 8.36, improvement: 95.66%)
- debian-13-xfce_exec-root: 27.83 🔻 ( previous job: 27.36, degradation: 101.71%)
- debian-13-xfce_socket: 8.54 🟢 ( previous job: 8.57, improvement: 99.59%)
- debian-13-xfce_socket-root: 8.43 🔻 ( previous job: 8.26, degradation: 102.03%)
- debian-13-xfce_exec-data-simplex: 66.03 🟢 ( previous job: 72.43, improvement: 91.17%)
- debian-13-xfce_exec-data-duplex: 68.92 🟢 ( previous job: 76.65, improvement: 89.92%)
- debian-13-xfce_exec-data-duplex-root: 83.49 🟢 ( previous job: 91.79, improvement: 90.96%)
- debian-13-xfce_socket-data-duplex: 129.44 🟢 ( previous job: 133.45, improvement: 97.00%)
- fedora-42-xfce_exec: 9.12 🔻 ( previous job: 9.06, degradation: 100.61%)
- fedora-42-xfce_exec-root: 59.95 🔻 ( previous job: 58.19, degradation: 103.02%)
- fedora-42-xfce_socket: 8.23 🟢 ( previous job: 8.48, improvement: 97.06%)
- fedora-42-xfce_socket-root: 8.15 🟢 ( previous job: 8.18, improvement: 99.65%)
- fedora-42-xfce_exec-data-simplex: 64.02 🟢 ( previous job: 78.48, improvement: 81.58%)
- fedora-42-xfce_exec-data-duplex: 68.05 🔻 ( previous job: 67.92, degradation: 100.19%)
- fedora-42-xfce_exec-data-duplex-root: 96.98 🔻 ( previous job: 96.36, degradation: 100.64%)
- fedora-42-xfce_socket-data-duplex: 133.19 🟢 ( previous job: 142.58, improvement: 93.41%)
- whonix-gateway-18_exec: 7.59
- whonix-gateway-18_exec-root: 142.22
- whonix-gateway-18_socket: 8.00
- whonix-gateway-18_socket-root: 8.28
- whonix-gateway-18_exec-data-simplex: 66.63
- whonix-gateway-18_exec-data-duplex: 75.15
- whonix-gateway-18_exec-data-duplex-root: 145.39
- whonix-gateway-18_socket-data-duplex: 144.97
- whonix-workstation-18_exec: 8.59
- whonix-workstation-18_exec-root: 139.20
- whonix-workstation-18_socket: 9.51
- whonix-workstation-18_socket-root: 9.16
- whonix-workstation-18_exec-data-simplex: 68.73
- whonix-workstation-18_exec-data-duplex: 72.67
- whonix-workstation-18_exec-data-duplex-root: 147.67
- whonix-workstation-18_socket-data-duplex: 135.77
- dom0_root_seq1m_q8t1_read 3:read_bandwidth_kb: 479020.00 🔻 ( previous job: 497426.00, degradation: 96.30%)
- dom0_root_rnd4k_q1t1_read 3:read_bandwidth_kb: 12193.00 🟢 ( previous job: 5874.00, improvement: 207.58%)
- dom0_root_rnd4k_q1t1_write 3:write_bandwidth_kb: 4439.00 🟢 ( previous job: 29.00, improvement: 15306.90%)
- dom0_varlibqubes_seq1m_q8t1_read 3:read_bandwidth_kb: 445255.00 🟢 ( previous job: 292489.00, improvement: 152.23%)
- dom0_varlibqubes_seq1m_q8t1_write 3:write_bandwidth_kb: 161989.00 🟢 ( previous job: 110817.00, improvement: 146.18%)
- dom0_varlibqubes_seq1m_q1t1_read 3:read_bandwidth_kb: 428514.00 🟢 ( previous job: 137802.00, improvement: 310.96%)
- dom0_varlibqubes_seq1m_q1t1_write 3:write_bandwidth_kb: 187964.00 🟢 ( previous job: 121719.00, improvement: 154.42%)
- dom0_varlibqubes_rnd4k_q32t1_read 3:read_bandwidth_kb: 100979.00 🔻 ( previous job: 103932.00, degradation: 97.16%)
- dom0_varlibqubes_rnd4k_q32t1_write 3:write_bandwidth_kb: 9198.00 🟢 ( previous job: 6356.00, improvement: 144.71%)
- dom0_varlibqubes_rnd4k_q1t1_read 3:read_bandwidth_kb: 8196.00 🟢 ( previous job: 7695.00, improvement: 106.51%)
- dom0_varlibqubes_rnd4k_q1t1_write 3:write_bandwidth_kb: 4867.00 🟢 ( previous job: 3925.00, improvement: 124.00%)
- fedora-42-xfce_root_seq1m_q8t1_read 3:read_bandwidth_kb: 415606.00 🟢 ( previous job: 366891.00, improvement: 113.28%)
- fedora-42-xfce_root_rnd4k_q32t1_read 3:read_bandwidth_kb: 81172.00 🔻 ( previous job: 86001.00, degradation: 94.38%)
- fedora-42-xfce_root_rnd4k_q1t1_read 3:read_bandwidth_kb: 8203.00 🔻 ( previous job: 9042.00, degradation: 90.72%)
- fedora-42-xfce_private_seq1m_q8t1_read 3:read_bandwidth_kb: 383672.00 🔻 ( previous job: 387500.00, degradation: 99.01%)
- fedora-42-xfce_private_rnd4k_q32t1_read 3:read_bandwidth_kb: 83788.00 🔻 ( previous job: 87396.00, degradation: 95.87%)
- fedora-42-xfce_volatile_seq1m_q8t1_write 3:write_bandwidth_kb: 143908.00 🔻 ( previous job: 157382.00, degradation: 91.44%)
- fedora-42-xfce_volatile_seq1m_q1t1_read 3:read_bandwidth_kb: 311705.00 🟢 ( previous job: 293225.00, improvement: 106.30%)
- fedora-42-xfce_volatile_rnd4k_q32t1_read 3:read_bandwidth_kb: 87922.00 🟢 ( previous job: 87141.00, improvement: 100.90%)
- fedora-42-xfce_volatile_rnd4k_q32t1_write 3:write_bandwidth_kb: 3802.00 🔻 ( previous job: 4098.00, degradation: 92.78%)
- fedora-42-xfce_volatile_rnd4k_q1t1_read 3:read_bandwidth_kb: 8237.00 🔻 ( previous job: 8804.00, degradation: 93.56%)
- fedora-42-xfce_volatile_rnd4k_q1t1_write 3:write_bandwidth_kb: 2619.00 🟢 ( previous job: 2384.00, improvement: 109.86%)
- debian-13-xfce_vm-dispvm (mean:6.595): 79.14
- debian-13-xfce_vm-dispvm-gui (mean:7.213): 86.55
- debian-13-xfce_vm-dispvm-concurrent (mean:3.446): 41.35
- debian-13-xfce_vm-dispvm-gui-concurrent (mean:4.021): 48.25
- debian-13-xfce_dom0-dispvm (mean:6.848): 82.17
- debian-13-xfce_dom0-dispvm-gui (mean:7.953): 95.43
- debian-13-xfce_dom0-dispvm-concurrent (mean:3.416): 41.00
- debian-13-xfce_dom0-dispvm-gui-concurrent (mean:4.34): 52.08
- debian-13-xfce_vm-dispvm-preload (mean:2.791): 33.50
- debian-13-xfce_vm-dispvm-preload-gui (mean:3.99): 47.88
- debian-13-xfce_vm-dispvm-preload-concurrent (mean:2.636): 31.63
- debian-13-xfce_vm-dispvm-preload-gui-concurrent (mean:3.39): 40.68
- debian-13-xfce_dom0-dispvm-preload (mean:3.55): 42.60
- debian-13-xfce_dom0-dispvm-preload-gui (mean:4.406): 52.88
- debian-13-xfce_dom0-dispvm-preload-concurrent (mean:3.344): 40.13
- debian-13-xfce_dom0-dispvm-preload-gui-concurrent (mean:3.823): 45.88
- debian-13-xfce_dom0-dispvm-api (mean:7.073): 84.88
- debian-13-xfce_dom0-dispvm-gui-api (mean:8.169): 98.03
- debian-13-xfce_dom0-dispvm-concurrent-api (mean:3.154): 37.85
- debian-13-xfce_dom0-dispvm-gui-concurrent-api (mean:4.046): 48.56
- debian-13-xfce_dom0-dispvm-preload-less-less-api (mean:3.663): 43.95
- debian-13-xfce_dom0-dispvm-preload-less-api (mean:3.792): 45.50
- debian-13-xfce_dom0-dispvm-preload-api (mean:3.652): 43.83
- debian-13-xfce_dom0-dispvm-preload-more-api (mean:3.387): 40.65
- debian-13-xfce_dom0-dispvm-preload-more-more-api (mean:3.724): 44.69
- debian-13-xfce_dom0-dispvm-preload-gui-api (mean:4.51): 54.12
- debian-13-xfce_dom0-dispvm-preload-concurrent-api (mean:2.876): 34.51
- debian-13-xfce_dom0-dispvm-preload-gui-concurrent-api (mean:3.833): 45.99
- debian-13-xfce_vm-vm (mean:0.033): 0.39
- debian-13-xfce_vm-vm-gui (mean:0.033): 0.40
- debian-13-xfce_vm-vm-concurrent (mean:0.019): 0.23
- debian-13-xfce_vm-vm-gui-concurrent (mean:0.026): 0.31
- debian-13-xfce_dom0-vm-api (mean:0.04): 0.48
- debian-13-xfce_dom0-vm-gui-api (mean:0.048): 0.58
- debian-13-xfce_dom0-vm-concurrent-api (mean:0.03): 0.36
- debian-13-xfce_dom0-vm-gui-concurrent-api (mean:0.033): 0.40
- fedora-42-xfce_vm-dispvm (mean:7.026): 84.32
- fedora-42-xfce_vm-dispvm-gui (mean:8.224): 98.69
- fedora-42-xfce_vm-dispvm-concurrent (mean:3.626): 43.51
- fedora-42-xfce_vm-dispvm-gui-concurrent (mean:4.356): 52.28
- fedora-42-xfce_dom0-dispvm (mean:7.502): 90.02
- fedora-42-xfce_dom0-dispvm-gui (mean:8.736): 104.84
- fedora-42-xfce_dom0-dispvm-concurrent (mean:3.756): 45.07
- fedora-42-xfce_dom0-dispvm-gui-concurrent (mean:4.641): 55.69
- fedora-42-xfce_vm-dispvm-preload (mean:3.082): 36.98
- fedora-42-xfce_vm-dispvm-preload-gui (mean:6.558): 78.69
- fedora-42-xfce_vm-dispvm-preload-concurrent (mean:2.923): 35.08
- fedora-42-xfce_vm-dispvm-preload-gui-concurrent (mean:5.069): 60.83
- fedora-42-xfce_dom0-dispvm-preload (mean:3.748): 44.97
- fedora-42-xfce_dom0-dispvm-preload-gui (mean:5.057): 60.68
- fedora-42-xfce_dom0-dispvm-preload-concurrent (mean:3.545): 42.54
- fedora-42-xfce_dom0-dispvm-preload-gui-concurrent (mean:4.176): 50.11
- fedora-42-xfce_dom0-dispvm-api (mean:7.673): 92.08
- fedora-42-xfce_dom0-dispvm-gui-api (mean:9.083): 109.00
- fedora-42-xfce_dom0-dispvm-concurrent-api (mean:3.667): 44.00
- fedora-42-xfce_dom0-dispvm-gui-concurrent-api (mean:4.432): 53.19
- fedora-42-xfce_dom0-dispvm-preload-less-less-api (mean:4.333): 51.99
- fedora-42-xfce_dom0-dispvm-preload-less-api (mean:4.135): 49.61
- fedora-42-xfce_dom0-dispvm-preload-api (mean:3.765): 45.18
- fedora-42-xfce_dom0-dispvm-preload-more-api (mean:3.934): 47.21
- fedora-42-xfce_dom0-dispvm-preload-more-more-api (mean:3.816): 45.79
- fedora-42-xfce_dom0-dispvm-preload-gui-api (mean:4.901): 58.81
- fedora-42-xfce_dom0-dispvm-preload-concurrent-api (mean:3.399): 40.79
- fedora-42-xfce_dom0-dispvm-preload-gui-concurrent-api (mean:4.197): 50.37
- fedora-42-xfce_vm-vm (mean:0.028): 0.33
- fedora-42-xfce_vm-vm-gui (mean:0.024): 0.29
- fedora-42-xfce_vm-vm-concurrent (mean:0.014): 0.17
- fedora-42-xfce_vm-vm-gui-concurrent (mean:0.019): 0.23
- fedora-42-xfce_dom0-vm-api (mean:0.035): 0.42
- fedora-42-xfce_dom0-vm-gui-api (mean:0.04): 0.47
- fedora-42-xfce_dom0-vm-concurrent-api (mean:0.03): 0.36
- fedora-42-xfce_dom0-vm-gui-concurrent-api (mean:0.028): 0.34
- whonix-workstation-18_vm-dispvm (mean:9.173): 110.07
- whonix-workstation-18_vm-dispvm-gui (mean:10.536): 126.43
- whonix-workstation-18_vm-dispvm-gui-concurrent (mean:5.664): 67.97
- whonix-workstation-18_dom0-dispvm (mean:9.459): 113.51
- whonix-workstation-18_dom0-dispvm-gui (mean:10.464): 125.56
- whonix-workstation-18_vm-dispvm-preload (mean:4.03): 48.37
- whonix-workstation-18_vm-dispvm-preload-gui (mean:5.698): 68.38
- whonix-workstation-18_vm-dispvm-preload-concurrent (mean:3.905): 46.86
- whonix-workstation-18_vm-dispvm-preload-gui-concurrent (mean:4.908): 58.89
- whonix-workstation-18_dom0-dispvm-preload (mean:5.041): 60.49
- whonix-workstation-18_dom0-dispvm-preload-gui (mean:6.076): 72.91
- whonix-workstation-18_dom0-dispvm-preload-concurrent (mean:4.402): 52.83
- whonix-workstation-18_dom0-dispvm-preload-gui-concurrent (mean:5.093): 61.11
- whonix-workstation-18_dom0-dispvm-api (mean:9.493): 113.92
- whonix-workstation-18_dom0-dispvm-gui-api (mean:10.75): 129.00
- whonix-workstation-18_dom0-dispvm-gui-concurrent-api (mean:5.476): 65.71
- whonix-workstation-18_dom0-dispvm-preload-less-less-api (mean:5.583): 67.00
- whonix-workstation-18_dom0-dispvm-preload-less-api (mean:5.427): 65.13
- whonix-workstation-18_dom0-dispvm-preload-api (mean:5.419): 65.03
- whonix-workstation-18_dom0-dispvm-preload-more-api (mean:5.494): 65.93
- whonix-workstation-18_dom0-dispvm-preload-more-more-api (mean:5.455): 65.47
- whonix-workstation-18_dom0-dispvm-preload-concurrent-api (mean:4.696): 56.36
- whonix-workstation-18_dom0-dispvm-preload-gui-concurrent-api (mean:5.39): 64.67
- whonix-workstation-18_vm-vm (mean:0.046): 0.55
- whonix-workstation-18_vm-vm-gui (mean:0.047): 0.57
- whonix-workstation-18_vm-vm-concurrent (mean:0.025): 0.31
- whonix-workstation-18_vm-vm-gui-concurrent (mean:0.029): 0.35
- whonix-workstation-18_dom0-vm-api (mean:0.037): 0.44
- whonix-workstation-18_dom0-vm-gui-api (mean:0.054): 0.65
- whonix-workstation-18_dom0-vm-concurrent-api (mean:0.029): 0.34
- whonix-workstation-18_dom0-vm-gui-concurrent-api (mean:0.025): 0.30
"Volume in use" was raised for two tests: race_less for Fedora and Whonix-Workstation. This time it was not raised on tearDown, but right after unpausing a domain for use. The other tests only saved a dependency graph image, which was not uploaded.
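To illustrate the race described in the commit message, here is a minimal asyncio sketch. `FakeDispVM`, its `kill()`, and its `cleanup()` are hypothetical stand-ins, not the real qubes API: the point is that when two teardown paths (the test's tearDown and the generic `qubes/tests/__init__.py` cleanup) both reach `_bare_cleanup()`, storage removal must be serialized behind the kill rather than running while the domain is still shutting down.

```python
import asyncio

class FakeDispVM:
    """Hypothetical model of a preloaded disposable being cleaned up twice."""

    def __init__(self):
        self.running = True
        self.removed = False
        self._cleanup_lock = asyncio.Lock()

    async def kill(self):
        await asyncio.sleep(0.01)  # simulated shutdown latency
        self.running = False

    async def cleanup(self):
        # Serializing through a lock and awaiting the kill guarantees
        # disk removal never overlaps a still-running domain.
        async with self._cleanup_lock:
            if self.running:
                await self.kill()
            if not self.removed:
                assert not self.running, "removing storage of a running domain"
                self.removed = True

async def main():
    vm = FakeDispVM()
    # Both the test tearDown and the generic test-harness teardown may
    # call cleanup concurrently; without the lock, the second caller
    # could start removal before the first caller's kill finished.
    await asyncio.gather(vm.cleanup(), vm.cleanup())
    return vm

vm = asyncio.run(main())
```

Removing the preloaded disposables explicitly in the test's tearDown sidesteps the second caller entirely, which is what this PR does.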
The bot posted info from the last run that finished, not the last run that was scheduled. Since the last finished run took a long time and was scheduled before the latest run, it contains outdated commits.
- Last run that finished -> 2025110317-4.3
- Last run that was scheduled -> 2025110502-4.3
From 2025110502-4.3:
Details
======================================================================
ERROR: qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-18/test_015_preload_race_more
Test race requesting multiple preloaded qubes
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib64/python3.13/asyncio/tasks.py", line 507, in wait_for
return await fut
^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/subprocess.py", line 201, in communicate
stdin, stdout, stderr = await tasks.gather(stdin, stdout, stderr)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/subprocess.py", line 181, in _read_stream
output = await stream.read()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/streams.py", line 723, in read
block = await self.read(self._limit)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/streams.py", line 730, in read
await self._wait_for_data('read')
File "/usr/lib64/python3.13/asyncio/streams.py", line 539, in _wait_for_data
await self._waiter
asyncio.exceptions.CancelledError
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 600, in test_015_preload_race_more
self.loop.run_until_complete(self._test_015_preload_race_more())
~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 616, in _test_015_preload_race_more
targets = await asyncio.gather(*tasks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 433, in run_preload_proc
stdout, _ = await asyncio.wait_for(proc.communicate(), timeout=60)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/tasks.py", line 506, in wait_for
async with timeouts.timeout(timeout):
~~~~~~~~~~~~~~~~^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/timeouts.py", line 116, in __aexit__
raise TimeoutError from exc_val
TimeoutError
======================================================================
ERROR: qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-18/test_017_preload_autostart
The script triggers the API call
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib64/python3.13/asyncio/tasks.py", line 507, in wait_for
return await fut
^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/subprocess.py", line 137, in wait
return await self._transport._wait()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/base_subprocess.py", line 253, in _wait
return await waiter
^^^^^^^^^^^^
asyncio.exceptions.CancelledError
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 682, in test_017_preload_autostart
self.loop.run_until_complete(asyncio.wait_for(proc.wait(), timeout=40))
~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File "/usr/lib64/python3.13/asyncio/tasks.py", line 506, in wait_for
async with timeouts.timeout(timeout):
~~~~~~~~~~~~~~~~^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/timeouts.py", line 116, in __aexit__
raise TimeoutError from exc_val
TimeoutError
======================================================================
FAIL: qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-18/test_030_edit_file
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 1078, in test_030_edit_file
self.fail("Timeout waiting for editor window")
~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError: Timeout waiting for editor window
======================================================================
FAIL: qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-18/test_100_open_in_dispvm
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 1199, in test_100_open_in_dispvm
click_to_open = self.loop.run_until_complete(
self.testvm1.run_for_stdio(
...<3 lines>...
)
)
File "/usr/lib64/python3.13/asyncio/base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
subprocess.CalledProcessError: Command './open-file test.txt' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 1211, in test_100_open_in_dispvm
self.fail(
~~~~~~~~~^
"'./open-file test.txt' failed with {}: {}{}".format(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
err.cmd, err.returncode, err.stdout, err.stderr
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
)
^
AssertionError: './open-file test.txt' failed with ./open-file test.txt: 1b'Creating logfile at /tmp/dogtail-user/logs/open-file_20251105-053322_debug ...\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 3)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 4)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 5)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 6)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 7)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 8)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 9)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 10)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 11)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 12)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 13)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 14)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 15)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 16)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 17)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 18)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 19)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 20)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 
21)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 22)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 23)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 24)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 25)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 26)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 27)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 28)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 29)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 30)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 31)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 32)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 33)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 34)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 35)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 36)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 37)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 38)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 39)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 40)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 41)\nSearching for child of [desktop frame | main]: 
\'Thunderbird|Icedove\' application (attempt 42)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 43)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 44)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 45)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 46)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 47)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 48)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 49)\nTraceback (most recent call last):\n File "/home/user/./open-file", line 122, in <module>\n main()\n ~~~~^^\n File "/home/user/./open-file", line 110, in main\n tb = get_app()\n File "/home/user/./open-file", line 44, in get_app\n tree.root.application(\'Thunderbird|Icedove\')\n ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^\n File "/usr/lib/python3/dist-packages/dogtail/tree.py", line 1668, in application\n return root.findChild(predicate.IsAnApplicationNamed(appName), recursive=False, retry=retry, showingOnly=False)\n ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "/usr/lib/python3/dist-packages/dogtail/tree.py", line 1357, in findChild\n raise SearchError(describeSearch(self, pred, recursive, debugName))\ndogtail.tree.SearchError: child of [desktop frame | main]: \'Thunderbird|Icedove\' application\n\n'
----------------------------------------------------------------------
Ran 56 tests in 5844.979s
FAILED (failures=2, errors=2, skipped=13)
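Both ERROR entries above follow the same shape: `asyncio.wait_for` cancels the awaited coroutine (`proc.communicate()` / `proc.wait()` in the tests) when the deadline passes, and on Python 3.11+ re-raises the resulting `CancelledError` as `TimeoutError`, which is why the tracebacks show the `CancelledError` first as the direct cause. A minimal standalone reproduction:

```python
import asyncio

async def main():
    try:
        # asyncio.sleep(10) stands in for the subprocess await in the
        # tests; wait_for cancels it once the 0.05 s deadline expires
        # and raises TimeoutError chained from the CancelledError.
        await asyncio.wait_for(asyncio.sleep(10), timeout=0.05)
    except asyncio.TimeoutError:
        return "timeout"
    return "completed"

result = asyncio.run(main())
```

So the timeouts here mean the preloaded disposable never finished starting within the test's 60 s / 40 s budget, not that the subprocess itself crashed.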
The most recent run also failed to upload details, but there is a log: https://openqa.qubes-os.org/tests/158674/file/system_tests-tests.log
Details
======================================================================
ERROR: qubes.tests.integ.dispvm/TC_20_DispVM_debian-13-xfce/test_015_preload_race_more
Test race requesting multiple preloaded qubes
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib64/python3.13/asyncio/tasks.py", line 507, in wait_for
return await fut
^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/subprocess.py", line 201, in communicate
stdin, stdout, stderr = await tasks.gather(stdin, stdout, stderr)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/subprocess.py", line 181, in _read_stream
output = await stream.read()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/streams.py", line 723, in read
block = await self.read(self._limit)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/streams.py", line 730, in read
await self._wait_for_data('read')
File "/usr/lib64/python3.13/asyncio/streams.py", line 539, in _wait_for_data
await self._waiter
asyncio.exceptions.CancelledError
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 600, in test_015_preload_race_more
self.loop.run_until_complete(self._test_015_preload_race_more())
~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 616, in _test_015_preload_race_more
targets = await asyncio.gather(*tasks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 433, in run_preload_proc
stdout, _ = await asyncio.wait_for(proc.communicate(), timeout=60)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/tasks.py", line 506, in wait_for
async with timeouts.timeout(timeout):
~~~~~~~~~~~~~~~~^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/timeouts.py", line 116, in __aexit__
raise TimeoutError from exc_val
TimeoutError
======================================================================
ERROR: qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-18/test_017_preload_autostart
The script triggers the API call
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib64/python3.13/asyncio/tasks.py", line 507, in wait_for
return await fut
^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/subprocess.py", line 137, in wait
return await self._transport._wait()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/base_subprocess.py", line 253, in _wait
return await waiter
^^^^^^^^^^^^
asyncio.exceptions.CancelledError
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 682, in test_017_preload_autostart
self.loop.run_until_complete(asyncio.wait_for(proc.wait(), timeout=40))
~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File "/usr/lib64/python3.13/asyncio/tasks.py", line 506, in wait_for
async with timeouts.timeout(timeout):
~~~~~~~~~~~~~~~~^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/timeouts.py", line 116, in __aexit__
raise TimeoutError from exc_val
TimeoutError
======================================================================
FAIL: qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-18/test_015_preload_race_more
Test race requesting multiple preloaded qubes
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 600, in test_015_preload_race_more
self.loop.run_until_complete(self._test_015_preload_race_more())
~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib64/python3.13/asyncio/base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 613, in _test_015_preload_race_more
await self.wait_preload(preload_max)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 410, in wait_preload
self.fail("last preloaded didn't complete in time")
~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError: last preloaded didn't complete in time
======================================================================
FAIL: qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-18/test_030_edit_file
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 1078, in test_030_edit_file
self.fail("Timeout waiting for editor window")
~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError: Timeout waiting for editor window
======================================================================
FAIL: qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-18/test_100_open_in_dispvm
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 1199, in test_100_open_in_dispvm
click_to_open = self.loop.run_until_complete(
self.testvm1.run_for_stdio(
...<3 lines>...
)
)
File "/usr/lib64/python3.13/asyncio/base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
subprocess.CalledProcessError: Command './open-file test.txt' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib64/python3.13/contextlib.py", line 85, in inner
return func(*args, **kwds)
File "/usr/lib/python3.13/site-packages/qubes/tests/integ/dispvm.py", line 1211, in test_100_open_in_dispvm
self.fail(
~~~~~~~~~^
"'./open-file test.txt' failed with {}: {}{}".format(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
err.cmd, err.returncode, err.stdout, err.stderr
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
)
^
AssertionError: './open-file test.txt' failed with ./open-file test.txt: 1b'Creating logfile at /tmp/dogtail-user/logs/open-file_20251105-165630_debug ...\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 3)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 4)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 5)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 6)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 7)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 8)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 9)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 10)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 11)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 12)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 13)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 14)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 15)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 16)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 17)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 18)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 19)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 20)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 
21)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 22)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 23)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 24)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 25)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 26)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 27)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 28)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 29)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 30)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 31)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 32)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 33)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 34)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 35)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 36)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 37)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 38)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 39)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 40)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 41)\nSearching for child of [desktop frame | main]: 
\'Thunderbird|Icedove\' application (attempt 42)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 43)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 44)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 45)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 46)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 47)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 48)\nSearching for child of [desktop frame | main]: \'Thunderbird|Icedove\' application (attempt 49)\nTraceback (most recent call last):\n File "/home/user/./open-file", line 122, in <module>\n main()\n ~~~~^^\n File "/home/user/./open-file", line 110, in main\n tb = get_app()\n File "/home/user/./open-file", line 44, in get_app\n tree.root.application(\'Thunderbird|Icedove\')\n ~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^\n File "/usr/lib/python3/dist-packages/dogtail/tree.py", line 1668, in application\n return root.findChild(predicate.IsAnApplicationNamed(appName), recursive=False, retry=retry, showingOnly=False)\n ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n File "/usr/lib/python3/dist-packages/dogtail/tree.py", line 1357, in findChild\n raise SearchError(describeSearch(self, pred, recursive, debugName))\ndogtail.tree.SearchError: child of [desktop frame | main]: \'Thunderbird|Icedove\' application\n\n'
----------------------------------------------------------------------
Ran 56 tests in 5761.980s
FAILED (failures=3, errors=2, skipped=13)
https://openqa.qubes-os.org/tests/158674/file/system_tests-var_log.tar.gz
Did the Debian template hit OOM while updating?
[2025-11-05 09:10:43] [ 41.643910] qrexec-agent[990]: 2025-11-05 09:10:43.928 qrexec-agent[990]: qrexec-agent-data.c:274:handle_new_process_common: executed: root:QUBESRPC qubes.VMExec+tar+--xzf+-2Frun-2Fqubes--update-2Fagent.tar.gz+--C+-2Frun-2Fqubes--update-2F dom0 (pid 993)
[2025-11-05 09:10:43] [ 41.652630] qrexec-agent[993]: pam_unix(qrexec:session): session opened for user root(uid=0) by (uid=0)
[2025-11-05 09:10:43] [ 41.670836] fbcon: Taking over console
[2025-11-05 09:10:43] [ 41.680747] systemd[1]: Started session-c5.scope - Session c5 of User root.
[2025-11-05 09:10:44] [ 41.765664] wireplumber[961]: [0;1;39mwp-internal-comp-loader: Loading profile 'main'[0m
[2025-11-05 09:10:44] [ 41.847737] qrexec-agent[993]: pam_unix(qrexec:session): session closed for user root
[2025-11-05 09:10:44] [ 41.858525] qrexec-agent[990]: 2025-11-05 09:10:44.143 qrexec-agent[990]: qrexec-agent-data.c:309:handle_new_process_common: pid 993 exited with 0
[2025-11-05 09:10:44] [ 41.858730] systemd[1]: session-c5.scope: Deactivated successfully.
[2025-11-05 09:10:44] [ 41.910745] qrexec-agent[1008]: 2025-11-05 09:10:44.194 qrexec-agent[1008]: qrexec-agent-data.c:274:handle_new_process_common: executed: root:QUBESRPC qubes.VMExec+chmod+u-2Bx+-2Frun-2Fqubes--update-2Fagent-2Fentrypoint.py dom0 (pid 1010)
[2025-11-05 09:10:44] [ 41.917798] qrexec-agent[1010]: pam_unix(qrexec:session): session opened for user root(uid=0) by (uid=0)
[2025-11-05 09:10:44] [ 41.954350] systemd[1]: Started session-c6.scope - Session c6 of User root.
[2025-11-05 09:10:44] [ 42.002878] audit: type=1400 audit(1762351844.277:119): apparmor="ALLOWED" operation="open" class="file" profile="Xorg" name="/dev/xen/gntalloc" pid=978 comm="Xorg" requested_mask="wr" denied_mask="wr" fsuid=1000 ouid=0
[2025-11-05 09:10:44] [ 42.066436] qrexec-agent[1010]: pam_unix(qrexec:session): session closed for user root
[2025-11-05 09:10:44] [ 42.074536] qrexec-agent[1008]: 2025-11-05 09:10:44.357 qrexec-agent[1008]: qrexec-agent-data.c:309:handle_new_process_common: pid 1010 exited with 0
[2025-11-05 09:10:44] [ 42.076891] systemd[1]: session-c6.scope: Deactivated successfully.
[2025-11-05 09:10:44] [ 42.113591] qrexec-agent[1019]: 2025-11-05 09:10:44.397 qrexec-agent[1019]: qrexec-agent-data.c:274:handle_new_process_common: executed: root:QUBESRPC qubes.VMExec+-2Fusr-2Fbin-2Fpython3+-2Frun-2Fqubes--update-2Fagent-2Fentrypoint.py+----log+DEBUG+----show--output dom0 (pid 1021)
[2025-11-05 09:10:44] [ 42.121857] qrexec-agent[1021]: pam_unix(qrexec:session): session opened for user root(uid=0) by (uid=0)
[2025-11-05 09:10:44] [ 42.140281] systemd[1]: Started session-c7.scope - Session c7 of User root.
[2025-11-05 09:10:44] [ 42.254084] wireplumber[961]: wp-device: SPA handle 'api.libcamera.enum.manager' could not be loaded; is it installed?
[2025-11-05 09:10:44] [ 42.254350] wireplumber[961]: s-monitors-libcamera: PipeWire's libcamera SPA plugin is missing or broken. Some camera types may not be supported.
[2025-11-05 09:10:44] [ 42.349081] Fallback order for Node 0: 0
[2025-11-05 09:10:44] [ 42.349095] Built 1 zonelists, mobility grouping on. Total pages: 86769
[2025-11-05 09:10:44] [ 42.350061] Policy zone: Normal
[2025-11-05 09:10:44] [ 42.427815] systemd-udevd[357]: Configuration file /usr/lib/udev/rules.d/80-qubes-usb-reset.rules is marked executable. Please remove executable permission bits. Proceeding anyway.
[2025-11-05 09:10:44] [ 42.434162] systemd-udevd[357]: /usr/lib/udev/rules.d/90-alsa-restore.rules:18 GOTO="alsa_restore_std" has no matching label, ignoring.
[2025-11-05 09:10:44] [ 42.436798] systemd-udevd[357]: /usr/lib/udev/rules.d/90-alsa-restore.rules:18 The line has no effect any more, dropping.
[2025-11-05 09:10:44] [ 42.436984] systemd-udevd[357]: /usr/lib/udev/rules.d/90-alsa-restore.rules:22 GOTO="alsa_restore_std" has no matching label, ignoring.
[2025-11-05 09:10:44] [ 42.437135] systemd-udevd[357]: /usr/lib/udev/rules.d/90-alsa-restore.rules:22 The line has no effect any more, dropping.
[2025-11-05 09:10:45]
[2025-11-05 09:10:45] Debian GNU/Linux 13 debian-13-xfce hvc0
[2025-11-05 09:10:45]
[2025-11-05 09:10:45] debian-13-xfce login: [ 43.101727] systemd[1]: Created slice system-qubes\x2dupdates\x2dproxy\x2dforwarder.slice - Slice /system/qubes-updates-proxy-forwarder.
[2025-11-05 09:10:45] [ 43.121917] systemd[1]: Started [email protected]:8082-127.0.0.1:41604.service - Forward connection to updates proxy over Qubes RPC (127.0.0.1:41604).
[2025-11-05 09:10:45] [ 43.143436] systemd[1]: Started [email protected]:8082-127.0.0.1:41618.service - Forward connection to updates proxy over Qubes RPC (127.0.0.1:41618).
[2025-11-05 09:10:45] [ 43.163898] systemd[1]: Started [email protected]:8082-127.0.0.1:41622.service - Forward connection to updates proxy over Qubes RPC (127.0.0.1:41622).
[2025-11-05 09:10:45] [ 43.169579] systemd[1]: Started [email protected]:8082-127.0.0.1:41630.service - Forward connection to updates proxy over Qubes RPC (127.0.0.1:41630).
[2025-11-05 09:10:45] [ 43.473584] systemd[1]: [email protected]:8082-127.0.0.1:41618.service: Deactivated successfully.
[2025-11-05 09:10:46] [ 44.179391] systemd[1]: [email protected]:8082-127.0.0.1:41604.service: Deactivated successfully.
[2025-11-05 09:10:46] [ 44.186461] systemd[1]: [email protected]:8082-127.0.0.1:41622.service: Deactivated successfully.
[2025-11-05 09:10:46] [ 44.190826] systemd[1]: [email protected]:8082-127.0.0.1:41630.service: Deactivated successfully.
[2025-11-05 09:10:46] [ 44.485649] qubes-gui[871]: Ok, somebody connected.
[2025-11-05 09:10:46] [ 44.655200] qrexec-agent[890]: pam_unix(qrexec:session): session closed for user user
[2025-11-05 09:10:46] [ 44.665597] systemd[1]: session-c2.scope: Deactivated successfully.
[2025-11-05 09:10:46] [ 44.673756] qrexec-agent[888]: 2025-11-05 09:10:46.957 qrexec-agent[888]: qrexec-agent-data.c:309:handle_new_process_common: pid 890 exited with 0
[2025-11-05 09:10:47] [ 44.756270] cpp invoked oom-killer: gfp_mask=0x140dca(GFP_HIGHUSER_MOVABLE|__GFP_COMP|__GFP_ZERO), order=0, oom_score_adj=0
[2025-11-05 09:10:47] [ 44.756735] CPU: 1 UID: 1000 PID: 1092 Comm: cpp Not tainted 6.12.54-1.qubes.fc41.x86_64 #1
[2025-11-05 09:10:47] [ 44.756744] Call Trace:
[2025-11-05 09:10:47] [ 44.756749] <TASK>
[2025-11-05 09:10:47] [ 44.756752] dump_stack_lvl+0x5d/0x80
[2025-11-05 09:10:47] [ 44.756757] dump_header+0x43/0x1aa
[2025-11-05 09:10:47] [ 44.756761] oom_kill_process.cold+0xa/0xae
[2025-11-05 09:10:47] [ 44.756764] out_of_memory+0xf3/0x290
[2025-11-05 09:10:47] [ 44.756769] __alloc_pages_slowpath.constprop.0+0x983/0xb10
[2025-11-05 09:10:47] [ 44.756774] __alloc_pages_noprof+0x316/0x330
[2025-11-05 09:10:47] [ 44.756778] alloc_pages_mpol_noprof+0xd7/0x1e0
[2025-11-05 09:10:47] [ 44.756780] folio_alloc_mpol_noprof+0x14/0x30
[2025-11-05 09:10:47] [ 44.756782] vma_alloc_folio_noprof+0x69/0xb0
[2025-11-05 09:10:47] [ 44.756784] alloc_anon_folio+0x1f9/0x430
[2025-11-05 09:10:47] [ 44.756787] do_anonymous_page+0x12b/0x4a0
[2025-11-05 09:10:47] [ 44.756790] __handle_mm_fault+0x39a/0x6f0
[2025-11-05 09:10:47] [ 44.756792] ? mt_find+0x213/0x570
[2025-11-05 09:10:47] [ 44.756797] handle_mm_fault+0x115/0x330
[2025-11-05 09:10:47] [ 44.756799] do_user_addr_fault+0x2ca/0x7b0
[2025-11-05 09:10:47] [ 44.756803] exc_page_fault+0x7e/0x180
[2025-11-05 09:10:47] [ 44.756807] asm_exc_page_fault+0x26/0x30
[2025-11-05 09:10:47] [ 44.756810] RIP: 0033:0x7fcc488800f4
[2025-11-05 09:10:47] [ 44.756812] Code: 59 14 00 48 8d 0c 1e 49 39 d0 49 89 48 60 0f 95 c2 48 29 d8 0f b6 d2 48 83 c8 01 48 c1 e2 02 48 09 da 48 83 ca 01 48 89 56 08 <48> 89 41 08 48 8d 4e 10 eb ae 48 8d 0d 3b cf 10 00 ba 9e 10 00 00
[2025-11-05 09:10:47] [ 44.756814] RSP: 002b:00007ffdbb696c40 EFLAGS: 00010202
[2025-11-05 09:10:47] [ 44.756816] RAX: 000000000001f4e1 RBX: 0000000000001450 RCX: 000000002a1eeb20
[2025-11-05 09:10:47] [ 44.756818] RDX: 0000000000001451 RSI: 000000002a1ed6d0 RDI: 0000000000000004
[2025-11-05 09:10:47] [ 44.756819] RBP: ffffffffffffffb0 R08: 00007fcc489c5ac0 R09: 0000000000000001
[2025-11-05 09:10:47] [ 44.756820] R10: 00007fcc489c5ac0 R11: 0000000000000143 R12: 0000000000001440
[2025-11-05 09:10:47] [ 44.756821] R13: 0000000000000000 R14: 0000000000000143 R15: 00007fcc489c5b20
[2025-11-05 09:10:47] [ 44.756824] </TASK>
[2025-11-05 09:10:47] [ 44.756825] Mem-Info:
[2025-11-05 09:10:47] [ 44.762705] active_anon:19375 inactive_anon:893 isolated_anon:0
[2025-11-05 09:10:47] [ 44.762705] active_file:30901 inactive_file:20225 isolated_file:0
[2025-11-05 09:10:47] [ 44.762705] unevictable:0 dirty:1380 writeback:0
[2025-11-05 09:10:47] [ 44.762705] slab_reclaimable:4089 slab_unreclaimable:11886
[2025-11-05 09:10:47] [ 44.762705] mapped:23568 shmem:236 pagetables:830
[2025-11-05 09:10:47] [ 44.762705] sec_pagetables:0 bounce:0
[2025-11-05 09:10:47] [ 44.762705] kernel_misc_reclaimable:0
[2025-11-05 09:10:47] [ 44.762705] free:2941 free_pcp:114 free_cma:0
[2025-11-05 09:10:47] [ 44.764495] Node 0 active_anon:77500kB inactive_anon:3156kB active_file:125896kB inactive_file:75888kB unevictable:0kB isolated(anon):0kB isolated(file):0kB mapped:94272kB dirty:5520kB writeback:0kB shmem:944kB shmem_thp:0kB shmem_pmdmapped:0kB anon_thp:0kB writeback_tmp:0kB kernel_stack:2640kB pagetables:3320kB sec_pagetables:0kB all_unreclaimable? no
[2025-11-05 09:10:47] [ 44.764909] Node 0 DMA free:896kB boost:0kB min:4kB low:4kB high:4kB reserved_highatomic:0KB active_anon:0kB inactive_anon:0kB active_file:0kB inactive_file:0kB unevictable:0kB writepending:0kB present:15996kB managed:1024kB mlocked:0kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
[2025-11-05 09:10:47] [ 44.766059] lowmem_reserve[]: 0 341 341 341 341
[2025-11-05 09:10:47] [ 44.766287] Node 0 DMA32 free:15280kB boost:5272kB min:7620kB low:8204kB high:8788kB reserved_highatomic:0KB active_anon:56680kB inactive_anon:3276kB active_file:96968kB inactive_file:56960kB unevictable:0kB writepending:4348kB present:393216kB managed:349516kB mlocked:0kB bounce:0kB free_pcp:380kB local_pcp:220kB free_cma:0kB
[2025-11-05 09:10:47] [ 44.766662] lowmem_reserve[]: 0 0 0 0 0
[2025-11-05 09:10:47] [ 44.766732] Node 0 Normal free:28kB boost:0kB min:0kB low:0kB high:0kB reserved_highatomic:0KB active_anon:20836kB inactive_anon:0kB active_file:31964kB inactive_file:15080kB unevictable:0kB writepending:1104kB present:131072kB managed:68988kB mlocked:0kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
[2025-11-05 09:10:47] [ 44.768014] lowmem_reserve[]: 0 0 0 0 0
[2025-11-05 09:10:47] [ 44.768075] Node 0 DMA: 0*4kB 0*8kB 0*16kB 0*32kB 0*64kB 1*128kB (U) 1*256kB (U) 1*512kB (U) 0*1024kB 0*2048kB 0*4096kB = 896kB
[2025-11-05 09:10:47] [ 44.769457] Node 0 DMA32: 1566*4kB (UME) 391*8kB (UME) 155*16kB (UME) 64*32kB (UME) 30*64kB (UME) 8*128kB (UME) 1*256kB (U) 0*512kB 0*1024kB 0*2048kB 0*4096kB = 17120kB
[2025-11-05 09:10:47] [ 44.769651] Node 0 Normal: 0*4kB 0*8kB 0*16kB 0*32kB 0*64kB 0*128kB 0*256kB 0*512kB 0*1024kB 0*2048kB 0*4096kB = 0kB
[2025-11-05 09:10:47] [ 44.769799] Node 0 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=1048576kB
[2025-11-05 09:10:47] [ 44.769942] Node 0 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=2048kB
[2025-11-05 09:10:47] [ 44.771523] 50085 total pagecache pages
[2025-11-05 09:10:47] [ 44.771579] 6 pages in swap cache
[2025-11-05 09:10:47] [ 44.771628] Free swap = 1047804kB
[2025-11-05 09:10:47] [ 44.771696] Total swap = 1048572kB
[2025-11-05 09:10:47] [ 44.771905] 135071 pages RAM
[2025-11-05 09:10:47] [ 44.771957] 0 pages HighMem/MovableOnly
[2025-11-05 09:10:47] [ 44.772039] 30189 pages reserved
[2025-11-05 09:10:47] [ 44.772089] 0 pages cma reserved
[2025-11-05 09:10:47] [ 44.774282] 0 pages hwpoisoned
[2025-11-05 09:10:47] [ 44.774338] Tasks state (memory values in pages):
[2025-11-05 09:10:47] [ 44.774952] [ pid ] uid tgid total_vm rss rss_anon rss_file rss_shmem pgtables_bytes swapents oom_score_adj name
[2025-11-05 09:10:47] [ 44.775108] [ 296] 0 296 10689 4032 288 3743 1 102400 0 -250 systemd-journal
[2025-11-05 09:10:47] [ 44.775716] [ 339] 0 339 17728 772 64 708 0 61440 0 0 qubesdb-daemon
[2025-11-05 09:10:47] [ 44.775862] [ 357] 0 357 9256 2839 832 2007 0 94208 0 -1000 systemd-udevd
[2025-11-05 09:10:47] [ 44.776381] [ 636] 0 636 2095 1214 785 429 0 57344 0 0 haveged
[2025-11-05 09:10:47] [ 44.776529] [ 638] 990 638 2068 1108 192 916 0 53248 0 -900 dbus-daemon
[2025-11-05 09:10:47] [ 44.776662] [ 642] 988 642 76638 1933 192 1741 0 94208 0 0 polkitd
[2025-11-05 09:10:47] [ 44.776818] [ 647] 0 647 4666 2225 288 1937 0 73728 0 0 systemd-logind
[2025-11-05 09:10:47] [ 44.777800] [ 649] 0 649 656 353 0 353 0 45056 0 0 meminfo-writer
[2025-11-05 09:10:47] [ 44.777949] [ 674] 0 674 4380 1650 224 1426 0 73728 0 0 wpa_supplicant
[2025-11-05 09:10:47] [ 44.778092] [ 676] 0 676 17820 585 64 521 0 61440 0 0 xl
[2025-11-05 09:10:47] [ 44.778363] [ 854] 0 854 2040 666 32 634 0 57344 0 0 agetty
[2025-11-05 09:10:47] [ 44.778524] [ 859] 0 859 1943 736 32 704 0 61440 0 0 agetty
[2025-11-05 09:10:47] [ 44.778657] [ 864] 0 864 861 575 32 543 0 49152 0 0 qrexec-agent
[2025-11-05 09:10:47] [ 44.778845] [ 871] 0 871 1659 801 64 737 0 53248 0 0 qubes-gui
[2025-11-05 09:10:47] [ 44.779688] [ 884] 0 884 5636 3225 608 2617 0 90112 0 100 systemd
[2025-11-05 09:10:47] [ 44.959218] [ 887] 0 887 5612 858 409 449 0 73728 0 100 (sd-pam)
[2025-11-05 09:10:47] [ 44.959372] [ 894] 0 894 1884 1019 128 891 0 61440 0 0 qubes-gui-runus
[2025-11-05 09:10:47] [ 44.959544] [ 907] 1000 907 5681 3233 608 2625 0 90112 0 100 systemd
[2025-11-05 09:10:47] [ 44.959683] [ 912] 1000 912 5644 865 414 451 0 77824 0 100 (sd-pam)
[2025-11-05 09:10:47] [ 44.959835] [ 947] 0 947 1950 1023 64 959 0 61440 0 100 dbus-daemon
[2025-11-05 09:10:47] [ 44.959970] [ 951] 1000 951 1950 1055 96 959 0 61440 0 200 dbus-daemon
[2025-11-05 09:10:47] [ 44.960130] [ 953] 1000 953 22892 2519 512 2007 0 94208 0 200 pipewire
[2025-11-05 09:10:47] [ 44.960748] [ 956] 1000 956 1025 557 32 525 0 53248 0 0 xinit
[2025-11-05 09:10:47] [ 44.960887] [ 957] 1000 957 2720 1206 96 1110 0 61440 0 200 pipewire
[2025-11-05 09:10:47] [ 44.961011] [ 959] 1000 959 803 561 32 529 0 49152 0 200 qrexec-client-v
[2025-11-05 09:10:47] [ 44.963223] [ 961] 1000 961 81361 3385 672 2713 0 126976 0 200 wireplumber
[2025-11-05 09:10:47] [ 44.963361] [ 964] 1000 964 6258 2058 224 1834 0 77824 0 200 pipewire-pulse
[2025-11-05 09:10:47] [ 44.963550] [ 966] 1000 966 18478 1049 64 985 0 69632 0 200 qubes-notificat
[2025-11-05 09:10:47] [ 44.963705] [ 978] 1000 978 66998 15608 3773 11835 0 323584 0 0 Xorg
[2025-11-05 09:10:47] [ 44.963858] [ 1019] 0 1019 17332 694 64 630 0 53248 0 0 qrexec-agent
[2025-11-05 09:10:47] [ 44.964337] [ 1021] 0 1021 18364 1053 129 924 0 69632 0 0 qrexec-agent
[2025-11-05 09:10:47] [ 44.964503] [ 1022] 0 1022 1928 657 32 625 0 61440 0 0 logger
[2025-11-05 09:10:47] [ 44.964653] [ 1023] 0 1023 5924 4021 2162 1859 0 90112 0 0 python3
[2025-11-05 09:10:47] [ 44.964795] [ 1027] 0 1027 9257 1654 862 792 0 69632 0 0 (udev-worker)
[2025-11-05 09:10:47] [ 44.964950] [ 1028] 0 1028 12655 7213 4544 2669 0 131072 0 0 apt-get
[2025-11-05 09:10:47] [ 44.965094] [ 1065] 1000 1065 670 460 32 428 0 45056 0 0 Xsession
[2025-11-05 09:10:47] [ 44.966277] [ 1088] 1000 1088 1934 999 64 935 0 57344 0 0 xsettingsd
[2025-11-05 09:10:47] [ 44.966501] [ 1090] 1000 1090 1039 567 32 535 0 49152 0 0 xrdb
[2025-11-05 09:10:47] [ 44.967684] [ 1091] 1000 1091 670 407 0 407 0 45056 0 0 sh
[2025-11-05 09:10:47] [ 45.157848] [ 1092] 1000 1092 958 389 0 389 0 45056 0 0 cpp
[2025-11-05 09:10:47] [ 45.157991] oom-kill:constraint=CONSTRAINT_NONE,nodemask=(null),cpuset=/,mems_allowed=0,global_oom,task_memcg=/user.slice/user-1000.slice/[email protected]/session.slice/wireplumber.service,task=wireplumber,pid=961,uid=1000
[2025-11-05 09:10:47] [ 45.158335] Out of memory: Killed process 961 (wireplumber) total-vm:325444kB, anon-rss:2688kB, file-rss:10852kB, shmem-rss:0kB, UID:1000 pgtables:124kB oom_score_adj:200
[2025-11-05 09:10:47] [ 45.149841] systemd[907]: wireplumber.service: A process of this unit has been killed by the OOM killer.
[2025-11-05 09:10:47] [ 45.150141] systemd[1]: [email protected]: A process of this unit has been killed by the OOM killer.
[2025-11-05 09:10:47] [ 45.170155] systemd[907]: wireplumber.service: Main process exited, code=killed, status=9/KILL
[2025-11-05 09:10:47] [ 45.170686] systemd[907]: wireplumber.service: Failed with result 'oom-kill'.
[2025-11-05 09:10:47] [ 45.481667] systemd[907]: wireplumber.service: Scheduled restart job, restart counter is at 1.
[2025-11-05 09:10:47] [ 45.486514] systemd[907]: Started wireplumber.service - Multimedia Service Session Manager.
[2025-11-05 09:10:47] [ 45.623964] wireplumber[1100]: wp-internal-comp-loader: Loading profile 'main'
[2025-11-05 09:10:48] [ 46.242882] wireplumber[1100]: wp-device: SPA handle 'api.libcamera.enum.manager' could not be loaded; is it installed?
[2025-11-05 09:10:48] [ 46.243359] wireplumber[1100]: s-monitors-libcamera: PipeWire's libcamera SPA plugin is missing or broken. Some camera types may not be supported.
[2025-11-05 09:10:48] [ 46.581419] wireplumber[1100]: wp-device: SPA handle 'api.bluez5.enum.dbus' could not be loaded; is it installed?
[2025-11-05 09:10:48] [ 46.581713] wireplumber[1100]: s-monitors: PipeWire's BlueZ SPA plugin is missing or broken. Bluetooth devices will not be supported.
[2025-11-05 09:10:48] [ 46.581917] wireplumber[1100]: wp-device: SPA handle 'api.bluez5.midi.enum' could not be loaded; is it installed?
[2025-11-05 09:10:48] [ 46.588920] wireplumber[1100]: s-monitors: PipeWire's BlueZ MIDI SPA missing or broken. Bluetooth not supported.
[2025-11-05 09:10:48] [ 46.589150] wireplumber[1100]: pw.resource: can't create node: No such file or directory
[2025-11-05 09:10:48] [ 46.589324] wireplumber[1100]: wp-node: failed to create node from factory 'spa-node-factory'
[2025-11-05 09:10:48] [ 46.589479] wireplumber[1100]: s-monitors: Failed to create BLE MIDI server.
[2025-11-05 09:10:49] [ 46.766382] xen-balloon: vmemmap alloc failure: order:9, mode:0x4cc0(GFP_KERNEL|__GFP_RETRY_MAYFAIL), nodemask=(null),cpuset=/,mems_allowed=0
[2025-11-05 09:10:49] [ 46.766599] CPU: 1 UID: 0 PID: 67 Comm: xen-balloon Not tainted 6.12.54-1.qubes.fc41.x86_64 #1
[2025-11-05 09:10:49] [ 46.766604] Call Trace:
[2025-11-05 09:10:49] [ 46.766607] <TASK>
[2025-11-05 09:10:49] [ 46.766611] dump_stack_lvl+0x5d/0x80
[2025-11-05 09:10:49] [ 46.766616] warn_alloc+0x164/0x190
[2025-11-05 09:10:49] [ 46.766623] vmemmap_alloc_block+0xd2/0xe0
[2025-11-05 09:10:49] [ 46.766627] vmemmap_populate_hugepages+0xef/0x1a0
[2025-11-05 09:10:49] [ 46.766630] vmemmap_populate+0x54/0x90
[2025-11-05 09:10:49] [ 46.766634] __populate_section_memmap+0xc5/0xe0
[2025-11-05 09:10:49] [ 46.766637] section_activate+0x14c/0x270
[2025-11-05 09:10:49] [ 46.766640] sparse_add_section+0x3d/0x150
[2025-11-05 09:10:49] [ 46.766643] __add_pages+0x9d/0x120
[2025-11-05 09:10:49] [ 46.766647] add_pages+0x28/0x80
[2025-11-05 09:10:49] [ 46.766650] add_memory_resource+0xfc/0x2c0
[2025-11-05 09:10:49] [ 46.766653] reserve_additional_memory+0x136/0x180
[2025-11-05 09:10:49] [ 46.766658] balloon_thread+0xd6/0x490
[2025-11-05 09:10:49] [ 46.766662] ? __pfx_autoremove_wake_function+0x10/0x10
[2025-11-05 09:10:49] [ 46.766666] ? __pfx_balloon_thread+0x10/0x10
[2025-11-05 09:10:49] [ 46.766684] kthread+0xd2/0x100
[2025-11-05 09:10:49] [ 46.766687] ? __pfx_kthread+0x10/0x10
[2025-11-05 09:10:49] [ 46.766690] ret_from_fork+0x34/0x50
[2025-11-05 09:10:49] [ 46.766693] ? __pfx_kthread+0x10/0x10
[2025-11-05 09:10:49] [ 46.766695] ret_from_fork_asm+0x1a/0x30
[2025-11-05 09:10:49] [ 46.766715] </TASK>
[2025-11-05 09:10:49] [ 46.766716] Mem-Info:
[2025-11-05 09:10:49] [ 46.774463] active_anon:18515 inactive_anon:9552 isolated_anon:0
[2025-11-05 09:10:49] [ 46.774463] active_file:20545 inactive_file:25451 isolated_file:0
[2025-11-05 09:10:49] [ 46.774463] unevictable:0 dirty:1642 writeback:624
[2025-11-05 09:10:49] [ 46.774463] slab_reclaimable:4091 slab_unreclaimable:11907
[2025-11-05 09:10:49] [ 46.774463] mapped:23096 shmem:169 pagetables:843
[2025-11-05 09:10:49] [ 46.774463] sec_pagetables:0 bounce:0
[2025-11-05 09:10:49] [ 46.774463] kernel_misc_reclaimable:0
[2025-11-05 09:10:49] [ 46.774463] free:15896 free_pcp:51 free_cma:0
[2025-11-05 09:10:49] [ 46.776558] Node 0 active_anon:74060kB inactive_anon:38208kB active_file:82180kB inactive_file:101804kB unevictable:0kB isolated(anon):0kB isolated(file):0kB mapped:92436kB dirty:6620kB writeback:2496kB shmem:676kB shmem_thp:0kB shmem_pmdmapped:0kB anon_thp:0kB writeback_tmp:0kB kernel_stack:2608kB pagetables:3372kB sec_pagetables:0kB all_unreclaimable? no
[2025-11-05 09:10:49] [ 46.776982] Node 0 DMA free:896kB boost:0kB min:4kB low:4kB high:4kB reserved_highatomic:0KB active_anon:0kB inactive_anon:0kB active_file:0kB inactive_file:0kB unevictable:0kB writepending:0kB present:15996kB managed:1024kB mlocked:0kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
[2025-11-05 09:10:49] [ 46.778578] lowmem_reserve[]: 0 341 341 341 341
[2025-11-05 09:10:49] [ 46.778663] Node 0 DMA32 free:12216kB boost:0kB min:2348kB low:2932kB high:3516kB reserved_highatomic:0KB active_anon:41656kB inactive_anon:37892kB active_file:42520kB inactive_file:94856kB unevictable:0kB writepending:7964kB present:393216kB managed:349516kB mlocked:0kB bounce:0kB free_pcp:204kB local_pcp:0kB free_cma:0kB
[2025-11-05 09:10:49] [ 46.779020] lowmem_reserve[]: 0 0 0 0 0
[2025-11-05 09:10:49] [ 46.779081] Node 0 Normal free:50224kB boost:0kB min:0kB low:0kB high:0kB reserved_highatomic:0KB active_anon:32700kB inactive_anon:232kB active_file:39640kB inactive_file:7016kB unevictable:0kB writepending:1120kB present:131072kB managed:131072kB mlocked:0kB bounce:0kB free_pcp:0kB local_pcp:0kB free_cma:0kB
[2025-11-05 09:10:49] [ 46.779769] lowmem_reserve[]: 0 0 0 0 0
[2025-11-05 09:10:49] [ 46.779841] Node 0 DMA: 0*4kB 0*8kB 0*16kB 0*32kB 0*64kB 1*128kB (U) 1*256kB (U) 1*512kB (U) 0*1024kB 0*2048kB 0*4096kB = 896kB
[2025-11-05 09:10:49] [ 46.780012] Node 0 DMA32: 842*4kB (UME) 269*8kB (UME) 71*16kB (UM) 53*32kB (UM) 44*64kB (UM) 7*128kB (UM) 1*256kB (U) 0*512kB 0*1024kB 0*2048kB 0*4096kB = 12320kB
[2025-11-05 09:10:49] [ 46.781627] Node 0 Normal: 0*4kB 2*8kB (UM) 2*16kB (UM) 19*32kB (UM) 94*64kB (UM) 96*128kB (M) 60*256kB (UM) 25*512kB (M) 3*1024kB (UM) 0*2048kB 0*4096kB = 50192kB
[2025-11-05 09:10:49] [ 46.781840] Node 0 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=1048576kB
[2025-11-05 09:10:49] [ 46.781951] Node 0 hugepages_total=0 hugepages_free=0 hugepages_surp=0 hugepages_size=2048kB
[2025-11-05 09:10:49] [ 46.782077] 46946 total pagecache pages
[2025-11-05 09:10:49] [ 46.783563] 780 pages in swap cache
[2025-11-05 09:10:49] [ 46.783616] Free swap = 1042940kB
[2025-11-05 09:10:49] [ 46.783667] Total swap = 1048572kB
[2025-11-05 09:10:49] [ 46.783734] 135071 pages RAM
[2025-11-05 09:10:49] [ 46.783783] 0 pages HighMem/MovableOnly
[2025-11-05 09:10:49] [ 46.783846] 14668 pages reserved
[2025-11-05 09:10:49] [ 46.783893] 0 pages cma reserved
[2025-11-05 09:10:49] [ 46.783940] 0 pages hwpoisoned
@ben-grande Possibly relevant: Thunderbird is no longer installed on Whonix-Workstation by default, so I'd guess that's why the test that searches for a Thunderbird window fails.