qubes-core-admin
[WIP] Preload disposables
For: https://github.com/QubesOS/qubes-issues/issues/1512
Untested and incomplete.
Removed the draft status even though this is still not ready, because GitLab CI fails to fetch amended and force-pushed commits when using the GitHub API.
Having problems with events not being fired; trying to understand why.
This didn't get far:
Mar 19 17:31:15.017096 dom0 qubesd[4569]: unhandled exception while calling src=b'dom0' meth=b'admin.vm.Start' dest=b'sys-firewall' arg=b'' len(untrusted_payload)=0
Mar 19 17:31:15.017096 dom0 qubesd[4569]: Traceback (most recent call last):
Mar 19 17:31:15.017096 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/api/__init__.py", line 333, in respond
Mar 19 17:31:15.017096 dom0 qubesd[4569]: response = await self.mgmt.execute(
Mar 19 17:31:15.017096 dom0 qubesd[4569]: ^^^^^^^^^^^^^^^^^^^^^^^^
Mar 19 17:31:15.017096 dom0 qubesd[4569]: untrusted_payload=untrusted_payload
Mar 19 17:31:15.017096 dom0 qubesd[4569]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mar 19 17:31:15.017096 dom0 qubesd[4569]: )
Mar 19 17:31:15.017096 dom0 qubesd[4569]: ^
Mar 19 17:31:15.017096 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/api/admin.py", line 947, in vm_start
Mar 19 17:31:15.017096 dom0 qubesd[4569]: await self.dest.start()
Mar 19 17:31:15.017096 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 426, in start
Mar 19 17:31:15.017096 dom0 qubesd[4569]: await super().start(**kwargs)
Mar 19 17:31:15.017096 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 1522, in start
Mar 19 17:31:15.017096 dom0 qubesd[4569]: await self.fire_event_async(
Mar 19 17:31:15.017096 dom0 qubesd[4569]: "domain-start", start_guid=start_guid
Mar 19 17:31:15.017096 dom0 qubesd[4569]: )
Mar 19 17:31:15.017096 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/events.py", line 234, in fire_event_async
Mar 19 17:31:15.017096 dom0 qubesd[4569]: sync_effects, async_effects = self._fire_event(
Mar 19 17:31:15.017096 dom0 qubesd[4569]: ~~~~~~~~~~~~~~~~^
Mar 19 17:31:15.017096 dom0 qubesd[4569]: event, kwargs, pre_event=pre_event
Mar 19 17:31:15.017096 dom0 qubesd[4569]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mar 19 17:31:15.017096 dom0 qubesd[4569]: )
Mar 19 17:31:15.017096 dom0 qubesd[4569]: ^
Mar 19 17:31:15.017096 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/events.py", line 169, in _fire_event
Mar 19 17:31:15.017096 dom0 qubesd[4569]: effect = func(self, event, **kwargs)
Mar 19 17:31:15.017096 dom0 qubesd[4569]: TypeError: DispVM.on_domain_started_dispvm() got an unexpected keyword argument 'start_guid'
And this:
Mar 19 17:31:05.021673 dom0 qubesd[4569]: vm.sys-firewall: Activating the sys-firewall VM
Mar 19 17:31:05.040131 dom0 qubesd[4569]: Uncaught exception from domain-unpaused handler for domain sys-firewall
Mar 19 17:31:05.040131 dom0 qubesd[4569]: Traceback (most recent call last):
Mar 19 17:31:05.040131 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/app.py", line 1624, in _domain_event_callback
Mar 19 17:31:05.040131 dom0 qubesd[4569]: vm.fire_event("domain-unpaused")
Mar 19 17:31:05.040131 dom0 qubesd[4569]: ~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^
Mar 19 17:31:05.040131 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/events.py", line 200, in fire_event
Mar 19 17:31:05.040131 dom0 qubesd[4569]: sync_effects, async_effects = self._fire_event(
Mar 19 17:31:05.040131 dom0 qubesd[4569]: ~~~~~~~~~~~~~~~~^
Mar 19 17:31:05.040131 dom0 qubesd[4569]: event, kwargs, pre_event=pre_event
Mar 19 17:31:05.040131 dom0 qubesd[4569]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Mar 19 17:31:05.040131 dom0 qubesd[4569]: )
Mar 19 17:31:05.040131 dom0 qubesd[4569]: ^
Mar 19 17:31:05.040131 dom0 qubesd[4569]: File "/usr/lib/python3.13/site-packages/qubes/events.py", line 169, in _fire_event
Mar 19 17:31:05.040131 dom0 qubesd[4569]: effect = func(self, event, **kwargs)
Mar 19 17:31:05.040131 dom0 qubesd[4569]: TypeError: DispVM.on_domain_unpaused() takes 1 positional argument but 2 were given
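Both tracebacks point at the handler signatures rather than the event machinery itself: fire_event()/fire_event_async() always pass the event name plus arbitrary keyword arguments to every registered handler. A minimal sketch of the expected handler shape (method names taken from the tracebacks; this is not the final fix):

```python
import qubes.events

# Inside DispVM. Handlers always receive the event name as the second
# positional argument (hence "takes 1 positional argument but 2 were
# given"), and must swallow unknown keyword arguments such as start_guid
# (hence "got an unexpected keyword argument 'start_guid'").
@qubes.events.handler("domain-start")
async def on_domain_started_dispvm(self, event, **kwargs):
    ...

@qubes.events.handler("domain-unpaused")
def on_domain_unpaused(self, event, **kwargs):
    ...
```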
OpenQA test summary
Complete test suite and dependencies: https://openqa.qubes-os.org/tests/overview?distri=qubesos&version=4.3&build=2025060710-4.3&flavor=pull-requests
Test run included the following:
- https://github.com/QubesOS/qubes-desktop-linux-menu/pull/55 (https://github.com/QubesOS/qubes-desktop-linux-menu/pull/55/commits/0dc860f59edf414c61248456af8ffde7caab5bc2)
- https://github.com/QubesOS/qubes-app-linux-split-gpg2/pull/24 (https://github.com/QubesOS/qubes-app-linux-split-gpg2/pull/24/commits/45a239db67472272376b0694ad56bd61a885fbeb)
- https://github.com/QubesOS/qubes-linux-pvgrub2/pull/16 (https://github.com/QubesOS/qubes-linux-pvgrub2/pull/16/commits/ea2ec7ca1dc5a3a897f84aa6be1b4a2871e18b25)
- https://github.com/QubesOS/qubes-gui-agent-linux/pull/227 (https://github.com/QubesOS/qubes-gui-agent-linux/pull/227/commits/ad0798e3855cb9f53501b83687ec14480fa7113e)
- https://github.com/QubesOS/qubes-manager/pull/425 (https://github.com/QubesOS/qubes-manager/pull/425/commits/207745bdbfb8bbada255e8806233f82d2481eba0)
- https://github.com/QubesOS/qubes-gui-agent-linux/pull/231 (https://github.com/QubesOS/qubes-gui-agent-linux/pull/231/commits/e89207cd80e76a54348e6dcd2ea1bead85b620c3)
- https://github.com/QubesOS/qubes-gui-daemon/pull/163 (https://github.com/QubesOS/qubes-gui-daemon/pull/163/commits/3a598114bf8e786ae6d936564a372bb002fdfd20)
- https://github.com/QubesOS/qubes-core-admin-client/pull/332 (https://github.com/QubesOS/qubes-core-admin-client/pull/332/commits/40487bcb59ff4a704248144f4783ce0b0e1532b8)
- https://github.com/QubesOS/qubes-xscreensaver/pull/18 (https://github.com/QubesOS/qubes-xscreensaver/pull/18/commits/3fd612a46e3ba93044ea0cfa6256a1e4edb73e94)
- https://github.com/QubesOS/qubes-manager/pull/398 (https://github.com/QubesOS/qubes-manager/pull/398/commits/679eff400bd0de4887f0bb61de34250a1f842709)
- https://github.com/QubesOS/qubes-gui-daemon/pull/155 (https://github.com/QubesOS/qubes-gui-daemon/pull/155/commits/a993c9de153481f6b9dc6971440b9d3da6fa9799)
- https://github.com/QubesOS/qubes-installer-qubes-os-windows-tools/pull/6 (https://github.com/QubesOS/qubes-installer-qubes-os-windows-tools/pull/6/commits/0b6a52acf4be881a23255fc210019be72bdfd702)
- https://github.com/QubesOS/qubes-core-agent-linux/pull/571 (https://github.com/QubesOS/qubes-core-agent-linux/pull/571/commits/f9a21b65a5f37f2f48251f5e31f85707759bfcc4)
- https://github.com/QubesOS/qubes-mgmt-salt-base-topd/pull/15 (https://github.com/QubesOS/qubes-mgmt-salt-base-topd/pull/15/commits/44d09b1dbe0918c9013c301bf556900419f45bd6)
- https://github.com/QubesOS/qubes-desktop-linux-manager/pull/258 (https://github.com/QubesOS/qubes-desktop-linux-manager/pull/258/commits/b237806c1c8aa15ef95e0cdaec199c3f63fb4a11)
- https://github.com/QubesOS/qubes-core-admin/pull/660 (https://github.com/QubesOS/qubes-core-admin/pull/660/commits/78aacd1f9e6f2c80ca0e693d42eff36a0281e151)
- https://github.com/QubesOS/qubes-core-admin/pull/683 (https://github.com/QubesOS/qubes-core-admin/pull/683/commits/819093859c781d788424c0327327df475c2a73c2)
- https://github.com/QubesOS/qubes-mgmt-salt-dom0-virtual-machines/pull/72 (https://github.com/QubesOS/qubes-mgmt-salt-dom0-virtual-machines/pull/72/commits/5999317aac637390e351eca19e53c747ce7bf4d6)
New failures, excluding unstable
Compared to: https://openqa.qubes-os.org/tests/overview?distri=qubesos&version=4.3&build=2025031804-4.3&flavor=update
- system_tests_splitgpg
  - TC_10_Thunderbird_whonix-workstation-17: test_000_send_receive_default (failure)
    dogtail.tree.SearchError: descendent of [application | Thunderbird]...
  - TC_10_Thunderbird_whonix-workstation-17: test_010_send_receive_inline_signed_only (failure)
    dogtail.tree.SearchError: descendent of [application | Thunderbird]...
  - TC_10_Thunderbird_whonix-workstation-17: test_020_send_receive_inline_with_attachment (failure)
    dogtail.tree.SearchError: descendent of [application | Thunderbird]...
- system_tests_extra
  - TC_10_Thunderbird_whonix-workstation-17: test_000_send_receive_default (failure)
    dogtail.tree.SearchError: descendent of [application | Thunderbird]...
  - TC_10_Thunderbird_whonix-workstation-17: test_010_send_receive_inline_signed_only (failure)
    dogtail.tree.SearchError: descendent of [application | Thunderbird]...
  - TC_10_Thunderbird_whonix-workstation-17: test_020_send_receive_inline_with_attachment (failure)
    dogtail.tree.SearchError: descendent of [application | Thunderbird]...
  - TC_00_QVCTest_whonix-workstation-17: test_010_screenshare (failure)
    AssertionError: 1 != 0 : Timeout waiting for /dev/video0 in test-in...
- system_tests_network_updates
  - TC_00_Dom0Upgrade_whonix-gateway-17: test_001_update_check (failure)
    ^... AssertionError: '' is not true
- system_tests_devices
  - TC_00_List_whonix-workstation-17: test_011_list_dm_mounted (failure)
    AssertionError: 'test-dm' == 'test-dm' : Device test-inst-vm:dm-0::...
- system_tests_kde_gui_interactive
  - gui_keyboard_layout: wait_serial (wait serial expected)
    # wait_serial expected: "echo -e '[Layout]\nLayoutList=us,de' | sud...
  - gui_keyboard_layout: Failed (test died)
    # Test died: command 'test "$(cd ~user;ls e1*)" = "$(qvm-run -p wor...
- system_tests_qwt_win10_seamless@hw13
  - windows_clipboard_and_filecopy: unnamed test (unknown)
  - windows_clipboard_and_filecopy: Failed (test died)
    # Test died: no candidate needle with tag(s) 'windows-Edge-address-...
- system_tests_qwt_win11@hw13
  - windows_install: wait_serial (wait serial expected)
    # wait_serial expected: qr/dcWzE-\d+-/...
  - windows_install: Failed (test died + timed out)
    # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...
- system_tests_qwt_win10@hw13
  - windows_install: Failed (test died)
    # Test died: command 'script -e -c 'bash -x /usr/bin/qvm-create-win...
- system_tests_guivm_vnc_gui_interactive
  - clipboard_and_web: unnamed test (unknown)
  - clipboard_and_web: Failed (test died)
    # Test died: no candidate needle with tag(s) 'personal-firefox' mat...
Failed tests
18 failures
(identical to the list under "New failures, excluding unstable" above)
Fixed failures
Compared to: https://openqa.qubes-os.org/tests/132953#dependencies
14 fixed
- system_tests_basic_vm_qrexec_gui
  - TC_20_NonAudio_whonix-gateway-17: test_300_bug_1028_gui_memory_pinning (failure)
    AssertionError: Dom0 window doesn't match VM window content, saved ...
- system_tests_qrexec
  - TC_00_Qrexec_fedora-41-xfce: test_081_qrexec_service_argument_allow_specific (error)
    subprocess.CalledProcessError: Command '/usr/lib/qubes/qrexec-clien...
- system_tests_kde_gui_interactive
  - clipboard_and_web: unnamed test (unknown)
  - clipboard_and_web: Failed (test died)
    # Test died: no candidate needle with tag(s) 'qubes-website' matche...
  - clipboard_and_web: wait_serial (wait serial expected)
    # wait_serial expected: "lspci; echo 2E8vz-\$?-"...
- system_tests_audio
  - TC_20_AudioVM_Pulse_whonix-workstation-17: test_252_audio_playback_audiovm_switch_hvm (failure)
    AssertionError: only silence detected, no useful audio data
- system_tests_whonix@hw7
  - whonixcheck: fail (unknown)
    Whonixcheck for sys-whonix failed...
  - whonixcheck: unnamed test (unknown)
- system_tests_guivm_vnc_gui_interactive
  - gui_filecopy: unnamed test (unknown)
  - gui_filecopy: Failed (test died)
    # Test died: no candidate needle with tag(s) 'files-work' matched...
- system_tests_suspend
  - suspend: unnamed test (unknown)
  - suspend: Failed (test died)
    # Test died: no candidate needle with tag(s) 'SUSPEND-FAILED' match...
- system_tests_whonix
  - whonixcheck: fail (unknown)
    Whonixcheck for sys-whonix failed...
  - whonixcheck: unnamed test (unknown)
Unstable tests
Performance Tests
Performance degradation:
16 performance degradations
- debian-12-xfce_exec: 8.41 :small_red_triangle: ( previous job: 7.12, degradation: 118.09%)
- whonix-gateway-17_socket: 8.09 :small_red_triangle: ( previous job: 7.24, degradation: 111.77%)
- dom0_root_seq1m_q8t1_read 3:read_bandwidth_kb: 334901.00 :small_red_triangle: ( previous job: 446963.00, degradation: 74.93%)
- dom0_root_seq1m_q1t1_read 3:read_bandwidth_kb: 123445.00 :small_red_triangle: ( previous job: 294295.00, degradation: 41.95%)
- dom0_root_seq1m_q1t1_write 3:write_bandwidth_kb: 46957.00 :small_red_triangle: ( previous job: 95454.00, degradation: 49.19%)
- dom0_root_rnd4k_q32t1_read 3:read_bandwidth_kb: 23101.00 :small_red_triangle: ( previous job: 79803.00, degradation: 28.95%)
- dom0_root_rnd4k_q32t1_write 3:write_bandwidth_kb: 1985.00 :small_red_triangle: ( previous job: 6149.00, degradation: 32.28%)
- dom0_root_rnd4k_q1t1_write 3:write_bandwidth_kb: 1275.00 :small_red_triangle: ( previous job: 4826.00, degradation: 26.42%)
- dom0_varlibqubes_seq1m_q8t1_write 3:write_bandwidth_kb: 97901.00 :small_red_triangle: ( previous job: 250795.00, degradation: 39.04%)
- dom0_varlibqubes_rnd4k_q1t1_write 3:write_bandwidth_kb: 3437.00 :small_red_triangle: ( previous job: 4903.00, degradation: 70.10%)
- fedora-41-xfce_root_seq1m_q1t1_write 3:write_bandwidth_kb: 47025.00 :small_red_triangle: ( previous job: 87940.00, degradation: 53.47%)
- fedora-41-xfce_private_rnd4k_q1t1_write 3:write_bandwidth_kb: 540.00 :small_red_triangle: ( previous job: 1130.00, degradation: 47.79%)
- fedora-41-xfce_volatile_seq1m_q8t1_write 3:write_bandwidth_kb: 66451.00 :small_red_triangle: ( previous job: 179949.00, degradation: 36.93%)
- fedora-41-xfce_volatile_seq1m_q1t1_read 3:read_bandwidth_kb: 252851.00 :small_red_triangle: ( previous job: 324737.00, degradation: 77.86%)
- fedora-41-xfce_volatile_rnd4k_q32t1_write 3:write_bandwidth_kb: 3430.00 :small_red_triangle: ( previous job: 5672.00, degradation: 60.47%)
- fedora-41-xfce_volatile_rnd4k_q1t1_write 3:write_bandwidth_kb: 1346.00 :small_red_triangle: ( previous job: 1953.00, degradation: 68.92%)
Remaining performance tests:
56 tests
- debian-12-xfce_exec-root: 29.20 :small_red_triangle: ( previous job: 28.65, degradation: 101.89%)
- debian-12-xfce_socket: 8.69 :small_red_triangle: ( previous job: 8.60, degradation: 101.04%)
- debian-12-xfce_socket-root: 8.45 :green_circle: ( previous job: 8.52, improvement: 99.15%)
- debian-12-xfce_exec-data-simplex: 73.49 :small_red_triangle: ( previous job: 71.62, degradation: 102.60%)
- debian-12-xfce_exec-data-duplex: 72.97 :small_red_triangle: ( previous job: 70.34, degradation: 103.74%)
- debian-12-xfce_exec-data-duplex-root: 69.14 :green_circle: ( previous job: 82.72, improvement: 83.58%)
- debian-12-xfce_socket-data-duplex: 139.09 :green_circle: ( previous job: 156.96, improvement: 88.62%)
- fedora-41-xfce_exec: 9.32 :small_red_triangle: ( previous job: 9.27, degradation: 100.58%)
- fedora-41-xfce_exec-root: 61.39 :green_circle: ( previous job: 61.51, improvement: 99.81%)
- fedora-41-xfce_socket: 9.07 :small_red_triangle: ( previous job: 8.63, degradation: 105.08%)
- fedora-41-xfce_socket-root: 8.55 :green_circle: ( previous job: 8.71, improvement: 98.24%)
- fedora-41-xfce_exec-data-simplex: 73.13 :green_circle: ( previous job: 75.53, improvement: 96.82%)
- fedora-41-xfce_exec-data-duplex: 77.13 :small_red_triangle: ( previous job: 71.56, degradation: 107.79%)
- fedora-41-xfce_exec-data-duplex-root: 94.75 :green_circle: ( previous job: 109.13, improvement: 86.82%)
- fedora-41-xfce_socket-data-duplex: 139.07 :green_circle: ( previous job: 150.61, improvement: 92.34%)
- whonix-gateway-17_exec: 7.23 :small_red_triangle: ( previous job: 6.82, degradation: 106.02%)
- whonix-gateway-17_exec-root: 38.97 :green_circle: ( previous job: 40.43, improvement: 96.38%)
- whonix-gateway-17_socket-root: 7.56 :green_circle: ( previous job: 7.65, improvement: 98.82%)
- whonix-gateway-17_exec-data-simplex: 72.95 :green_circle: ( previous job: 78.32, improvement: 93.14%)
- whonix-gateway-17_exec-data-duplex: 76.55 :green_circle: ( previous job: 76.65, improvement: 99.87%)
- whonix-gateway-17_exec-data-duplex-root: 76.19 :green_circle: ( previous job: 88.52, improvement: 86.07%)
- whonix-gateway-17_socket-data-duplex: 170.15 :green_circle: ( previous job: 171.76, improvement: 99.06%)
- whonix-workstation-17_exec: 8.32 :small_red_triangle: ( previous job: 7.67, degradation: 108.50%)
- whonix-workstation-17_exec-root: 52.73 :green_circle: ( previous job: 58.26, improvement: 90.50%)
- whonix-workstation-17_socket: 8.15 :green_circle: ( previous job: 8.19, improvement: 99.49%)
- whonix-workstation-17_socket-root: 8.87 :small_red_triangle: ( previous job: 8.13, degradation: 109.13%)
- whonix-workstation-17_exec-data-simplex: 78.01 :small_red_triangle: ( previous job: 74.99, degradation: 104.02%)
- whonix-workstation-17_exec-data-duplex: 73.89 :small_red_triangle: ( previous job: 72.71, degradation: 101.63%)
- whonix-workstation-17_exec-data-duplex-root: 102.57 :small_red_triangle: ( previous job: 99.82, degradation: 102.76%)
- whonix-workstation-17_socket-data-duplex: 141.17 :green_circle: ( previous job: 169.50, improvement: 83.28%)
- dom0_root_seq1m_q8t1_write 3:write_bandwidth_kb: 215623.00 :green_circle: ( previous job: 129298.00, improvement: 166.76%)
- dom0_root_rnd4k_q1t1_read 3:read_bandwidth_kb: 10560.00 :small_red_triangle: ( previous job: 10795.00, degradation: 97.82%)
- dom0_varlibqubes_seq1m_q8t1_read 3:read_bandwidth_kb: 501951.00 :green_circle: ( previous job: 382273.00, improvement: 131.31%)
- dom0_varlibqubes_seq1m_q1t1_read 3:read_bandwidth_kb: 437453.00 :small_red_triangle: ( previous job: 437636.00, degradation: 99.96%)
- dom0_varlibqubes_seq1m_q1t1_write 3:write_bandwidth_kb: 171688.00 :small_red_triangle: ( previous job: 184752.00, degradation: 92.93%)
- dom0_varlibqubes_rnd4k_q32t1_read 3:read_bandwidth_kb: 106050.00 :green_circle: ( previous job: 62195.00, improvement: 170.51%)
- dom0_varlibqubes_rnd4k_q32t1_write 3:write_bandwidth_kb: 8788.00 :green_circle: ( previous job: 6479.00, improvement: 135.64%)
- dom0_varlibqubes_rnd4k_q1t1_read 3:read_bandwidth_kb: 8264.00 :green_circle: ( previous job: 7669.00, improvement: 107.76%)
- fedora-41-xfce_root_seq1m_q8t1_read 3:read_bandwidth_kb: 411367.00 :green_circle: ( previous job: 368309.00, improvement: 111.69%)
- fedora-41-xfce_root_seq1m_q8t1_write 3:write_bandwidth_kb: 188031.00 :green_circle: ( previous job: 162081.00, improvement: 116.01%)
- fedora-41-xfce_root_seq1m_q1t1_read 3:read_bandwidth_kb: 296207.00 :small_red_triangle: ( previous job: 318716.00, degradation: 92.94%)
- fedora-41-xfce_root_rnd4k_q32t1_read 3:read_bandwidth_kb: 87855.00 :green_circle: ( previous job: 82694.00, improvement: 106.24%)
- fedora-41-xfce_root_rnd4k_q32t1_write 3:write_bandwidth_kb: 3267.00 :small_red_triangle: ( previous job: 3599.00, degradation: 90.78%)
- fedora-41-xfce_root_rnd4k_q1t1_read 3:read_bandwidth_kb: 7977.00 :small_red_triangle: ( previous job: 8485.00, degradation: 94.01%)
- fedora-41-xfce_root_rnd4k_q1t1_write 3:write_bandwidth_kb: 1571.00 :green_circle: ( previous job: 542.00, improvement: 289.85%)
- fedora-41-xfce_private_seq1m_q8t1_read 3:read_bandwidth_kb: 358978.00 :small_red_triangle: ( previous job: 373957.00, degradation: 95.99%)
- fedora-41-xfce_private_seq1m_q8t1_write 3:write_bandwidth_kb: 189376.00 :green_circle: ( previous job: 170062.00, improvement: 111.36%)
- fedora-41-xfce_private_seq1m_q1t1_read 3:read_bandwidth_kb: 330572.00 :small_red_triangle: ( previous job: 334687.00, degradation: 98.77%)
- fedora-41-xfce_private_seq1m_q1t1_write 3:write_bandwidth_kb: 65840.00 :green_circle: ( previous job: 61534.00, improvement: 107.00%)
- fedora-41-xfce_private_rnd4k_q32t1_read 3:read_bandwidth_kb: 83433.00 :green_circle: ( previous job: 80283.00, improvement: 103.92%)
- fedora-41-xfce_private_rnd4k_q32t1_write 3:write_bandwidth_kb: 3240.00 :green_circle: ( previous job: 2215.00, improvement: 146.28%)
- fedora-41-xfce_private_rnd4k_q1t1_read 3:read_bandwidth_kb: 8734.00 :green_circle: ( previous job: 7540.00, improvement: 115.84%)
- fedora-41-xfce_volatile_seq1m_q8t1_read 3:read_bandwidth_kb: 376914.00 :green_circle: ( previous job: 369868.00, improvement: 101.91%)
- fedora-41-xfce_volatile_seq1m_q1t1_write 3:write_bandwidth_kb: 142636.00 :green_circle: ( previous job: 17567.00, improvement: 811.95%)
- fedora-41-xfce_volatile_rnd4k_q32t1_read 3:read_bandwidth_kb: 76530.00 :small_red_triangle: ( previous job: 79021.00, degradation: 96.85%)
- fedora-41-xfce_volatile_rnd4k_q1t1_read 3:read_bandwidth_kb: 7764.00 :small_red_triangle: ( previous job: 7867.00, degradation: 98.69%)
- TC_04_DispVM: test_002_cleanup (error)
  {message : RuntimeWarning("coroutine 'DispVM.on_domain_unpaused' wa...
- TC_04_DispVM: test_003_cleanup_destroyed (error)
  {message : RuntimeWarning("coroutine 'DispVM.on_domain_unpaused' wa...
This and other similar failures here look like a missing await somewhere.
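For reference, my reading of these warnings (an assumption, not a confirmed diagnosis): "domain-unpaused" is fired with the synchronous fire_event() from app.py, so an "async def" handler only produces a coroutine that nothing awaits. A sketch of one way around that:

```python
import asyncio
import qubes.events

# Inside DispVM. A synchronous handler can still kick off async work by
# scheduling it on the running loop instead of returning a coroutine
# (the helper name below is hypothetical):
@qubes.events.handler("domain-unpaused")
def on_domain_unpaused(self, event, **kwargs):
    asyncio.ensure_future(self._mark_preload_complete())
```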
PipelineRetryFailed
PipelineRetryFailed
PipelineRetryFailed
PipelineRetryFailed
PipelineRetryFailed
PipelineRetryFailed
Looks like unit tests hung, specifically qubes.tests.api_admin/TC_00_VMs/test_643_vm_create_disposable_preload_autostart ... (or the one after?)
qubes.tests.api_admin/TC_00_VMs/test_643_vm_create_disposable_preload_autostart ... ok started Domain 'runner-32522443-project-22468673-concurrent-0-job-9718764351' destroyed Domain 'runner-32522443-project-22468673-concurrent-0-job-9718764351' has been undefined
Strange, as they don't hang locally, only fail (as I haven't got them to work yet).
% ./run-tests -f qubes.tests.api_admin/TC_00_VMs/test_643_vm_create_disposable_preload_autostart
...
qubes.tests.api_admin/TC_00_VMs/test_643_vm_create_disposable_preload_autostart ... FAIL (AssertionError: AssertionError("event 'domain-preloaded-dispvm-autostart' did not fire on <qubes.tests.TestEmitter object at 0x7019b78e0dd0>"))
FAIL
======================================================================
FAIL: qubes.tests.api_admin/TC_00_VMs/test_643_vm_create_disposable_preload_autostart
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/unittest/mock.py", line 1369, in patched
return func(*newargs, **newkeywargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/src/contrib/qubes-core-admin/qubes/tests/api_admin.py", line 3780, in test_643_vm_create_disposable_preload_autostart
self.assertEventFired(self.emitter, "domain-preloaded-dispvm-autostart")
File "/home/user/src/contrib/qubes-core-admin/qubes/tests/__init__.py", line 709, in assertEventFired
self.fail(
AssertionError: event 'domain-preloaded-dispvm-autostart' did not fire on <qubes.tests.TestEmitter object at 0x7019b78e0dd0>
----------------------------------------------------------------------
Ran 1 test in 5.410s
FAILED (failures=1)
% ./run-tests -f qubes.tests.api_admin/TC_00_VMs/test_643_vm_create_disposable_preload_use
...
qubes.tests.api_admin/TC_00_VMs/test_643_vm_create_disposable_preload_use ... ERROR (RuntimeError: RuntimeError('\n{message : RuntimeWarning("coroutine \'TC_00_VMs.dummy_coro\' was never awaited"), category : \'RuntimeWarning\', filename : \'/home/user/src/contrib/qubes-core-admin/qubes/vm/dispvm.py\', lineno : 396, line : None}'))
ERROR
======================================================================
ERROR: qubes.tests.api_admin/TC_00_VMs/test_643_vm_create_disposable_preload_use
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/python3.11/contextlib.py", line 81, in inner
return func(*args, **kwds)
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/unittest/mock.py", line 1369, in patched
return func(*newargs, **newkeywargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/src/contrib/qubes-core-admin/qubes/tests/api_admin.py", line 3828, in test_643_vm_create_disposable_preload_use
self.assertEqual(dispvm.features.get("internal", False), False)
AssertionError: '1' != False
During handling of the above exception, another exception occurred:
RuntimeError:
{message : RuntimeWarning("coroutine 'TC_00_VMs.dummy_coro' was never awaited"), category : 'RuntimeWarning', filename : '/home/user/src/contrib/qubes-core-admin/qubes/vm/dispvm.py', lineno : 396, line : None}
----------------------------------------------------------------------
Ran 1 test in 0.477s
FAILED (errors=1)
PipelineRetryFailed
PipelineRetry
And also, git history will want a cleanup at some point
Understood. I am making multiple unclean commits because it is easier to revert something or to review a specific change in quick iterations. By the end I will probably squash them all together, since most of the history was not a fully working setup and had to change multiple times, unless I see a clear distinction that warrants separate commits, or you say that a single commit with many line changes is not something you want.
Codecov Report
Attention: Patch coverage is 65.20548% with 127 lines in your changes missing coverage. Please review.
Project coverage is 70.51%. Comparing base (37e49a1) to head (78aacd1). Report is 1 commit behind head on main.
Additional details and impacted files
@@ Coverage Diff @@
## main #660 +/- ##
==========================================
- Coverage 70.60% 70.51% -0.10%
==========================================
Files 61 61
Lines 12984 13312 +328
==========================================
+ Hits 9168 9387 +219
- Misses 3816 3925 +109
| Flag | Coverage Δ | |
|---|---|---|
| unittests | 70.51% <65.20%> (-0.10%) | :arrow_down: |
Please check that I have not misunderstood the contents, though; my understanding here is far from perfect.
You understand most of it, all the important parts, so that is great.
docs look good :)
Would you like more?
- https://github.com/QubesOS/qubes-core-admin-client/pull/354
Your input was valuable.
Does preloading a qube block the qubesd event loop? That’s a really bad idea. Preloading a qube should happen asynchronously, but the qube being preloaded should be marked as unusable until it is ready, and attempting to use a qube being preloaded should wait (asynchronously) until the qube is fully ready.
Does preloading a qube block the qubesd event loop?
It doesn't.
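For the record, the non-blocking behaviour the question asks for can be sketched with an asyncio.Event (all names here are hypothetical, not the PR's actual code): the preload runs as a background task, and anything that requests the qube awaits readiness instead of blocking qubesd.

```python
import asyncio

class PreloadedDispVM:
    """Hypothetical sketch: preload in the background, consumers await."""

    def __init__(self):
        self._ready = asyncio.Event()

    async def preload(self):
        # Start the qube, wait for the qrexec/GUI session, pause it, etc.
        await self._start_and_prepare()  # assumed helper
        self._ready.set()  # now safe to hand out

    async def claim(self):
        # Requesting a qube that is still preloading waits asynchronously
        # until it is fully ready, without blocking the event loop.
        await self._ready.wait()
        return self
```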
You probably noticed, but one of the unit tests is not happy; and pylint complains about one too long line...
- TC_20_DispVM_debian-12-xfce: test_014_dvm_run_preload_nogui (error)
qubes.exc.QubesValueError: Qube GUI is 'False' and does not support...
Uhm, looks like I forgot to include core-agent-linux PR in this test...
- integ tests
- journal
- video: starts at time 23:04:24
Not all qubes from the second batch loaded; they timed out after 120 seconds when trying to start. The integration test logs don't tell much, but the journal says it was out of memory. It is easy to tell that it is the second batch because of the video.
Four qubes try to preload (I lowered it from 5 because 8 GB of RAM is too little for this, at least before fixing the pause issue with too much RAM), followed by 4 more qubes trying to preload. Three qubes failed to start, and there were 3 qmemman messages about failing to satisfy assignments.
Integration tests
qubes.tests.integ.dispvm/TC_20_DispVM_whonix-workstation-17/test_015_dvm_run_preload_race_more
Test race requesting multiple preloaded qubes ... CRITICAL:qubes.tests.integ.dispvm.TC_20_DispVM_whonix-workstation-17.test_015_dvm_run_preload_race_more:starting
ERROR
ERROR:vm.disp2399:Start failed: Cannot connect to qrexec agent for 120 seconds, see /var/log/xen/console/guest-disp2399.log for details
VM disp2399 start failed at 2025-06-05 19:05:32
ERROR:vm.disp7862:Start failed: Cannot connect to qrexec agent for 120 seconds, see /var/log/xen/console/guest-disp7862.log for details
VM disp7862 start failed at 2025-06-05 19:05:32
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-55523' coro=<Emitter.fire_event_async() done, defined at /usr/lib/python3.13/site-packages/qubes/events.py:211> exception=ExceptionGroup('unhandled errors in a TaskGroup', [QubesVMError('Cannot connect to qrexec agent for 120 seconds, see /var/log/xen/console/guest-disp7862.log for details')])>
+ Exception Group Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/events.py", line 243, in fire_event_async
| effect = task.result()
| File "/usr/lib/python3.13/site-packages/qubes/vm/mix/dvmtemplate.py", line 283, in on_domain_preload_dispvm_used
| async with asyncio.TaskGroup() as task_group:
| ~~~~~~~~~~~~~~~~~^^
| File "/usr/lib64/python3.13/asyncio/taskgroups.py", line 71, in __aexit__
| return await self._aexit(et, exc)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/lib64/python3.13/asyncio/taskgroups.py", line 173, in _aexit
| raise BaseExceptionGroup(
| ...<2 lines>...
| ) from None
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 2147, in start_qrexec_daemon
| await self.start_daemon(
| ...<4 lines>...
| )
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 2104, in start_daemon
| raise subprocess.CalledProcessError(
| p.returncode, command, output=stdout, stderr=stderr
| )
| subprocess.CalledProcessError: Command '['runuser', '-u', 'user', '--', '/usr/sbin/qrexec-daemon', '-q', '-u', '2cf97795-c360-436b-ad24-caff898bc2db', '--', '67', 'disp7862', 'user']' returned non-zero exit status 3.
|
| During handling of the above exception, another exception occurred:
|
| Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 529, in from_appvm
| await dispvm.start()
| File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 608, in start
| await super().start(**kwargs)
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 1532, in start
| await self.start_qrexec_daemon()
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 2155, in start_qrexec_daemon
| raise qubes.exc.QubesVMError(
| ...<5 lines>...
| )
| qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seconds, see /var/log/xen/console/guest-disp7862.log for details
+------------------------------------
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-55487' coro=<Emitter.fire_event_async() done, defined at /usr/lib/python3.13/site-packages/qubes/events.py:211> exception=ExceptionGroup('unhandled errors in a TaskGroup', [QubesVMError('Cannot connect to qrexec agent for 120 seconds, see /var/log/xen/console/guest-disp2399.log for details')])>
+ Exception Group Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/events.py", line 243, in fire_event_async
| effect = task.result()
| File "/usr/lib/python3.13/site-packages/qubes/vm/mix/dvmtemplate.py", line 283, in on_domain_preload_dispvm_used
| async with asyncio.TaskGroup() as task_group:
| ~~~~~~~~~~~~~~~~~^^
| File "/usr/lib64/python3.13/asyncio/taskgroups.py", line 71, in __aexit__
| return await self._aexit(et, exc)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/lib64/python3.13/asyncio/taskgroups.py", line 173, in _aexit
| raise BaseExceptionGroup(
| ...<2 lines>...
| ) from None
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 2147, in start_qrexec_daemon
| await self.start_daemon(
| ...<4 lines>...
| )
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 2104, in start_daemon
| raise subprocess.CalledProcessError(
| p.returncode, command, output=stdout, stderr=stderr
| )
| subprocess.CalledProcessError: Command '['runuser', '-u', 'user', '--', '/usr/sbin/qrexec-daemon', '-q', '-u', '4fc2d0d5-5f24-4305-b995-a418df2762bd', '--', '66', 'disp2399', 'user']' returned non-zero exit status 3.
|
| During handling of the above exception, another exception occurred:
|
| Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 529, in from_appvm
| await dispvm.start()
| File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 608, in start
| await super().start(**kwargs)
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 1532, in start
| await self.start_qrexec_daemon()
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 2155, in start_qrexec_daemon
| raise qubes.exc.QubesVMError(
| ...<5 lines>...
| )
| qubes.exc.QubesVMError: Cannot connect to qrexec agent for 120 seconds, see /var/log/xen/console/guest-disp2399.log for details
+------------------------------------
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-58312' coro=<DispVM.on_domain_started_dispvm() done, defined at /usr/lib/python3.13/site-packages/qubes/vm/dispvm.py:332> exception=QubesException("Error on Qrexec call to 'qubes.WaitForSession' during preload startup")>
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 362, in on_domain_started_dispvm
await asyncio.wait_for(
...<6 lines>...
)
File "/usr/lib64/python3.13/asyncio/tasks.py", line 507, in wait_for
return await fut
^^^^^^^^^
File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 1888, in run_service_for_stdio
raise subprocess.CalledProcessError(
p.returncode, args[0], *stdouterr
)
subprocess.CalledProcessError: Command 'qubes.WaitForSession' returned non-zero exit status 255.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 379, in on_domain_started_dispvm
raise qubes.exc.QubesException(
"Error on Qrexec call to '%s' during preload startup" % service
)
qubes.exc.QubesException: Error on Qrexec call to 'qubes.WaitForSession' during preload startup
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-58056' coro=<DispVM.on_domain_started_dispvm() done, defined at /usr/lib/python3.13/site-packages/qubes/vm/dispvm.py:332> exception=QubesException("Error on Qrexec call to 'qubes.WaitForSession' during preload startup")>
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 362, in on_domain_started_dispvm
await asyncio.wait_for(
...<6 lines>...
)
File "/usr/lib64/python3.13/asyncio/tasks.py", line 507, in wait_for
return await fut
^^^^^^^^^
File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 1888, in run_service_for_stdio
raise subprocess.CalledProcessError(
p.returncode, args[0], *stdouterr
)
subprocess.CalledProcessError: Command 'qubes.WaitForSession' returned non-zero exit status 255.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 379, in on_domain_started_dispvm
raise qubes.exc.QubesException(
"Error on Qrexec call to '%s' during preload startup" % service
)
qubes.exc.QubesException: Error on Qrexec call to 'qubes.WaitForSession' during preload startup
ERROR:vm.disp7639:Start failed: Timed out Qrexec call to 'qubes.WaitForSession' after '120' seconds during preload startup
VM disp7639 start failed at 2025-06-05 19:06:50
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-54106' coro=<Emitter.fire_event_async() done, defined at /usr/lib/python3.13/site-packages/qubes/events.py:211> exception=ExceptionGroup('unhandled errors in a TaskGroup', [QubesException("Timed out Qrexec call to 'qubes.WaitForSession' after '120' seconds during preload startup")])>
+ Exception Group Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/events.py", line 243, in fire_event_async
| effect = task.result()
| File "/usr/lib/python3.13/site-packages/qubes/vm/mix/dvmtemplate.py", line 283, in on_domain_preload_dispvm_used
| async with asyncio.TaskGroup() as task_group:
| ~~~~~~~~~~~~~~~~~^^
| File "/usr/lib64/python3.13/asyncio/taskgroups.py", line 71, in __aexit__
| return await self._aexit(et, exc)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/lib64/python3.13/asyncio/taskgroups.py", line 173, in _aexit
| raise BaseExceptionGroup(
| ...<2 lines>...
| ) from None
| ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
+-+---------------- 1 ----------------
| Traceback (most recent call last):
| File "/usr/lib64/python3.13/asyncio/tasks.py", line 507, in wait_for
| return await fut
| ^^^^^^^^^
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 1885, in run_service_for_stdio
| stdouterr = await p.communicate(input=input)
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/lib64/python3.13/asyncio/subprocess.py", line 202, in communicate
| await self.wait()
| File "/usr/lib64/python3.13/asyncio/subprocess.py", line 137, in wait
| return await self._transport._wait()
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
| File "/usr/lib64/python3.13/asyncio/base_subprocess.py", line 248, in _wait
| return await waiter
| ^^^^^^^^^^^^
| asyncio.exceptions.CancelledError
|
| The above exception was the direct cause of the following exception:
|
| Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 362, in on_domain_started_dispvm
| await asyncio.wait_for(
| ...<6 lines>...
| )
| File "/usr/lib64/python3.13/asyncio/tasks.py", line 506, in wait_for
| async with timeouts.timeout(timeout):
| ~~~~~~~~~~~~~~~~^^^^^^^^^
| File "/usr/lib64/python3.13/asyncio/timeouts.py", line 116, in __aexit__
| raise TimeoutError from exc_val
| TimeoutError
|
| During handling of the above exception, another exception occurred:
|
| Traceback (most recent call last):
| File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 529, in from_appvm
| await dispvm.start()
| File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 608, in start
| await super().start(**kwargs)
| File "/usr/lib/python3.13/site-packages/qubes/vm/qubesvm.py", line 1534, in start
| await self.fire_event_async(
| "domain-start", start_guid=start_guid
| )
| File "/usr/lib/python3.13/site-packages/qubes/events.py", line 243, in fire_event_async
| effect = task.result()
| File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 374, in on_domain_started_dispvm
| raise qubes.exc.QubesException(
| ...<2 lines>...
| )
| qubes.exc.QubesException: Timed out Qrexec call to 'qubes.WaitForSession' after '120' seconds during preload startup
+------------------------------------
WARNING:vm.disp7639:Requested preloaded qube but failed to finish preloading after '144' seconds, falling back to normal disposable
ERROR:asyncio:Task exception was never retrieved
future: <Task finished name='Task-59090' coro=<DispVM.cleanup() done, defined at /usr/lib/python3.13/site-packages/qubes/vm/dispvm.py:573> exception=AttributeError("'DispVM' object has no attribute 'app'")>
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 579, in cleanup
if self not in self.app.domains:
^^^^^^^^
AttributeError: 'DispVM' object has no attribute 'app'
ERROR:app:unhandled exception while calling src=b'dom0' meth=b'admin.vm.CreateDisposable' dest=b'test-inst-dvm' arg=b'' len(untrusted_payload)=0
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/qubes/api/__init__.py", line 333, in respond
response = await self.mgmt.execute(
^^^^^^^^^^^^^^^^^^^^^^^^
untrusted_payload=untrusted_payload
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
)
^
File "/usr/lib/python3.13/site-packages/qubes/api/admin.py", line 1332, in create_disposable
dispvm = await qubes.vm.dispvm.DispVM.from_appvm(appvm)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.13/site-packages/qubes/vm/dispvm.py", line 515, in from_appvm
dispvm = app.add_new_vm(
cls, template=appvm, auto_cleanup=True, **kwargs
)
File "/usr/lib/python3.13/site-packages/qubes/app.py", line 1416, in add_new_vm
qid = self.domains.get_new_unused_qid()
^^^^^^^^^^^^
AttributeError: 'Qubes' object has no attribute 'domains'
FAIL
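Side note on the repeated "ERROR:asyncio:Task exception was never retrieved" lines above: they mean fire-and-forget tasks finished with an exception that nothing ever read. A generic asyncio remedy (a sketch, not necessarily what this code should do) is to attach a done-callback that consumes and logs the failure:

```python
import asyncio
import logging

log = logging.getLogger("app")

def log_task_result(task: asyncio.Task) -> None:
    # Reading the exception marks it as retrieved, so asyncio no longer
    # complains "Task exception was never retrieved" when the task is
    # garbage-collected.
    if not task.cancelled() and task.exception() is not None:
        log.error("background task failed", exc_info=task.exception())

async def main() -> None:
    async def failing_job():  # stand-in for a fire-and-forget event task
        raise RuntimeError("boom")

    task = asyncio.ensure_future(failing_job())
    task.add_done_callback(log_task_result)
    await asyncio.sleep(0.1)  # let the task finish and the callback run

asyncio.run(main())
```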
Journal
There are 3 of these log entries:
Jun 05 19:04:27.585433 dom0 qmemman.systemstate[1466]: Xen free = 356106234 too small to satisfy assignments! assigned_but_unused=355485297, domdict={'0': {'memory_current': 3972907008, 'memory_actual': 4294967296, 'memory_maximum': 4294967296, 'mem_used': 1433985024, 'id': '0', 'last_target': 4294967296, 'use_hotplug': False, 'no_progress': False, 'slow_memset_react': False}, '1': {'memory_current': 297840640, 'memory_actual': 297840640, 'memory_maximum': 314572800, 'mem_used': None, 'id': '1', 'last_target': 297795584, 'use_hotplug': False, 'no_progress': False, 'slow_memset_react': False}, '2': {'memory_current': 150994944, 'memory_actual': 150994944, 'memory_maximum': 150994944, 'mem_used': None, 'id': '2', 'last_target': 150994944, 'use_hotplug': False, 'no_progress': False, 'slow_memset_react': False}, '3': {'memory_current': 297840640, 'memory_actual': 297840640, 'memory_maximum': 314572800, 'mem_used': None, 'id': '3', 'last_target': 297795584, 'use_hotplug': False, 'no_progress': False, 'slow_memset_react': False}, '4': {'memory_current': 150994944, 'memory_actual': 150994944, 'memory_maximum': 150994944, 'mem_used': None, 'id': '4', 'last_target': 150994944, 'use_hotplug': False, 'no_progress': False, 'slow_memset_react': False}, '5': {'memory_current': 985567232, 'memory_actual': 1002279084, 'memory_maximum': 4194304000, 'mem_used': 355131392, 'id': '5', 'last_target': 1002279084, 'use_hotplug': True, 'no_progress': False, 'slow_memset_react': False}, '6': {'memory_current': 1237733376, 'memory_actual': 1254446533, 'memory_maximum': 4194304000, 'mem_used': 451739648, 'id': '6', 'last_target': 1254446533, 'use_hotplug': True, 'no_progress': False, 'slow_memset_react': False}, '60': {'memory_current': 419495936, 'memory_actual': 419495936, 'memory_maximum': 4194304000, 'mem_used': None, 'id': '60', 'last_target': 419430400, 'use_hotplug': True, 'no_progress': False, 'slow_memset_react': False}, '61': {'memory_current': 419495936, 'memory_actual': 419495936, 'memory_maximum': 4194304000, 'mem_used': None, 'id': '61', 'last_target': 419430400, 'use_hotplug': True, 'no_progress': False, 'slow_memset_react': False}}
In other words, what can be done to fix this? Make Whonix-Workstation preload fewer qubes (2 or 3), as it is too resource intensive?
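Rough arithmetic from the journal entry above (my reading of the numbers): dom0 sits at its 4 GiB target, domains 5 and 6 hold about 0.9 GiB and 1.2 GiB, the four small service domains add roughly 0.85 GiB, and the two preloads already running (ids 60 and 61) take about 0.4 GiB each, which already comes close to the full 8 GB. With Xen free at ~340 MiB against ~339 MiB assigned-but-unused, there is essentially nothing left for the next batch, so further preload starts can only fail or time out.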
Uhm, looks like I forgot to include core-agent-linux PR in this test...
Organized the main issue https://github.com/QubesOS/qubes-issues/issues/1512; use the PRs from the MVP section.