Crusader server issue under OpenWrt
It seems to me that the crusader server can time out easily after being idle, unlike iperf3.
Is this known behavior? If so, how do I work around it?
FYI, so far I have only run the server on the Linux side (Debian 12 LXC container or OpenWrt 23.05).
What do you mean by timeout here?
It exits by itself after some idle time.
With any error or non-zero process status?
Let me see if I can capture the exit status.
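In a POSIX shell, one way to capture it is to `wait` on the background PID. This is only a generic sketch; `sleep 0` below is a stand-in for `./crusader serve` so it stays self-contained:

```shell
# Capture the exit status of a backgrounded process in POSIX sh.
# "sleep 0" is a placeholder for "./crusader serve".
sleep 0 &          # start the server in the background (placeholder)
pid=$!             # PID of the most recent background job
wait "$pid"        # block until the process exits
status=$?          # exit status of the waited-on process
echo "process exited with status $status"
```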
This is what I do.
root@debian12ct2:~# iperf3 -s &
[1] 863
root@debian12ct2:~# -----------------------------------------------------------
Server listening on 5201 (test #1)
-----------------------------------------------------------
root@debian12ct2:~# ./crusader serve &
[2] 864
root@debian12ct2:~# Server running...
root@debian12ct2:~# ps aux
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 1.1 101952 11648 ? Ss Mar26 0:00 /sbin/init
root 49 0.0 1.1 32952 12024 ? Ss Mar26 0:00 /lib/systemd/systemd-journald
systemd+ 90 0.0 0.8 17860 8704 ? Ss Mar26 0:00 /lib/systemd/systemd-networkd
root 101 0.0 0.1 3600 2048 ? Ss Mar26 0:00 /usr/sbin/cron -f
message+ 102 0.0 0.4 9132 4736 ? Ss Mar26 0:00 /usr/bin/dbus-daemon --system --address=systemd: --no
root 105 0.0 0.7 17156 7936 ? Ss Mar26 0:00 /lib/systemd/systemd-logind
root 151 0.0 0.1 2516 1536 pts/0 Ss+ Mar26 0:00 /sbin/agetty -o -p -- \u --noclear --keep-baud - 1152
root 152 0.0 0.3 6120 3968 pts/1 Ss Mar26 0:00 /bin/login -p --
root 153 0.0 0.1 2516 1536 pts/2 Ss+ Mar26 0:00 /sbin/agetty -o -p -- \u --noclear - linux
root 154 0.0 0.8 15408 9344 ? Ss Mar26 0:00 sshd: /usr/sbin/sshd -D [listener] 0 of 10-100 startu
root 297 0.0 0.4 42652 4632 ? Ss Mar26 0:00 /usr/lib/postfix/sbin/master -w
postfix 299 0.0 0.6 43088 6784 ? S Mar26 0:00 qmgr -l -t unix -u
root 353 0.0 0.4 4980 4224 pts/1 S+ Mar26 0:00 -bash
postfix 855 0.0 0.6 43040 7040 ? S 21:49 0:00 pickup -l -t unix -u -c
root 863 0.0 0.3 8192 3840 pts/1 S 22:59 0:00 iperf3 -s
root 864 0.0 0.2 10452 2976 pts/1 Sl 22:59 0:01 ./crusader serve
root 871 0.0 1.0 17840 11136 ? Ss 23:11 0:00 sshd: root@pts/3
root 874 0.0 0.9 18700 10112 ? Ss 23:11 0:00 /lib/systemd/systemd --user
root 875 0.0 0.4 103012 4560 ? S 23:11 0:00 (sd-pam)
root 893 0.0 0.3 4976 4096 pts/3 Ss 23:11 0:00 -bash
root 898 0.0 0.3 8088 4096 pts/3 R+ 23:11 0:00 ps aux
root@debian12ct2:~# date
Wed Mar 27 23:12:15 UTC 2024
It is working now.
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> .\crusader test 192.168.38.6
Connected to server 192.168.38.6:35481
Latency to server 2.70 ms
Testing download...
Testing upload...
Testing both download and upload...
Warning: Load termination timed out. There may be residual untracked traffic in the background.
Writing data...
Saved raw data as data 2024.03.28 07-58-57.crr
Saved plot as plot 2024.03.28 07-58-57.png
Then the next day the iperf3 process is still there, but the crusader process is gone.
Note: I am located in Singapore (GMT+8). It is currently morning here; I will check again this evening and tomorrow morning to see when the process disappears.
Same for OpenWRT, I was running crusader and iperf3 at the same time yesterday and now crusader process is gone.
root@OpenWrt:~# ps | grep iperf3
4031 root 1144 S grep iperf3
15514 root 972 S iperf3 -s
root@OpenWrt:~# ps | grep crusader
4045 root 1148 S grep crusader
root@OpenWrt:~#
Still alive after 10 hours.
root@debian12ct2:~# ps aux
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 1 0.0 1.1 101952 11648 ? Ss Mar26 0:00 /sbin/init
root 49 0.0 1.1 32952 12280 ? Ss Mar26 0:00 /lib/systemd/systemd-journald
systemd+ 90 0.0 0.8 17860 8704 ? Ss Mar26 0:00 /lib/systemd/systemd-networkd
root 101 0.0 0.1 3600 2048 ? Ss Mar26 0:00 /usr/sbin/cron -f
message+ 102 0.0 0.4 9132 4736 ? Ss Mar26 0:00 /usr/bin/dbus-daemon --system --address=systemd: --no
root 105 0.0 0.7 17164 7936 ? Ss Mar26 0:00 /lib/systemd/systemd-logind
root 151 0.0 0.1 2516 1536 pts/0 Ss+ Mar26 0:00 /sbin/agetty -o -p -- \u --noclear --keep-baud - 1152
root 152 0.0 0.3 6120 3968 pts/1 Ss Mar26 0:00 /bin/login -p --
root 153 0.0 0.1 2516 1536 pts/2 Ss+ Mar26 0:00 /sbin/agetty -o -p -- \u --noclear - linux
root 154 0.0 0.8 15408 9344 ? Ss Mar26 0:00 sshd: /usr/sbin/sshd -D [listener] 0 of 10-100 startu
root 297 0.0 0.4 42652 4632 ? Ss Mar26 0:00 /usr/lib/postfix/sbin/master -w
postfix 299 0.0 0.6 43088 6784 ? S Mar26 0:00 qmgr -l -t unix -u
root 353 0.0 0.4 4980 4224 pts/1 S+ Mar26 0:00 -bash
root 863 0.0 0.3 8192 3840 pts/1 S Mar27 0:00 iperf3 -s
root 864 0.0 0.2 10464 2960 pts/1 Sl Mar27 0:02 ./crusader serve
postfix 1156 0.0 0.6 43040 7040 ? S 09:30 0:00 pickup -l -t unix -u -c
root 1161 0.0 1.0 17840 11136 ? Ss 10:46 0:00 sshd: root@pts/3
root 1164 0.0 0.9 18700 9984 ? Ss 10:47 0:00 /lib/systemd/systemd --user
root 1165 0.0 0.4 103012 4572 ? S 10:47 0:00 (sd-pam)
root 1183 0.0 0.3 4976 4096 pts/3 Ss 10:47 0:00 -bash
root 1186 0.0 0.3 8088 4096 pts/3 R+ 10:47 0:00 ps aux
root@debian12ct2:~# date
Thu Mar 28 10:47:20 UTC 2024
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> .\crusader test 192.168.38.6
Connected to server 192.168.38.6:35481
Latency to server 2.61 ms
Testing download...
Testing upload...
Testing both download and upload...
Warning: Server overload detected during test. Result should be discarded.
Warning: Load termination timed out. There may be residual untracked traffic in the background.
Writing data...
Saved raw data as data 2024.03.28 18-48-21.crr
Saved plot as plot 2024.03.28 18-48-21.png
Let me monitor over the next two days to see whether this is a false alarm. Thanks.
I started the crusader server process under OpenWrt 23.05 a few hours ago as well, but it seems to have died after I logged in to check the status (OK) and then exited (GONE).
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> ssh [email protected]
[email protected]'s password:
BusyBox v1.36.1 (2023-11-14 13:38:11 UTC) built-in shell (ash)
_______ ________ __
| |.-----.-----.-----.| | | |.----.| |_
| - || _ | -__| || | | || _|| _|
|_______|| __|_____|__|__||________||__| |____|
|__| W I R E L E S S F R E E D O M
-----------------------------------------------------
OpenWrt 23.05.2, r23630-842932a63d
-----------------------------------------------------
root@OpenWrt:~# date
Thu Mar 28 14:31:24 UTC 2024
root@OpenWrt:~# ps | grep iperf3
4446 root 1144 R grep iperf3
15514 root 972 S iperf3 -s
root@OpenWrt:~# ps | grep crusader
4468 root 1148 R grep crusader
6200 root 10432 S ./crusader serve
root@OpenWrt:~# exit
Connection to 192.168.38.1 closed.
I ran the test on the client side and the test failed.
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> .\crusader test 192.168.38.1
Connected to server 192.168.38.1:35481
thread 'tokio-runtime-worker' panicked at crusader-lib\src\test.rs:791:71:
called `Result::unwrap()` on an `Err` value: "Expected object"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> .\crusader test 192.168.38.1
thread 'main' panicked at crusader-lib\src\test.rs:1318:10:
called `Result::unwrap()` on an `Err` value: Os { code: 10061, kind: ConnectionRefused, message: "No connection could be made because the target machine actively refused it." }
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
Then I logged in again; the process was gone.
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> ssh [email protected]
[email protected]'s password:
BusyBox v1.36.1 (2023-11-14 13:38:11 UTC) built-in shell (ash)
_______ ________ __
| |.-----.-----.-----.| | | |.----.| |_
| - || _ | -__| || | | || _|| _|
|_______|| __|_____|__|__||________||__| |____|
|__| W I R E L E S S F R E E D O M
-----------------------------------------------------
OpenWrt 23.05.2, r23630-842932a63d
-----------------------------------------------------
root@OpenWrt:~# date
Thu Mar 28 14:39:24 UTC 2024
root@OpenWrt:~# ps | grep iperf3
6281 root 1144 S grep iperf3
15514 root 972 S iperf3 -s
root@OpenWrt:~# ps | grep crusader
6289 root 1148 S grep crusader
No issues with the Debian 12 LXC container.
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> ssh [email protected]
[email protected]'s password:
Linux debian12ct2 6.5.11-7-pve #1 SMP PREEMPT_DYNAMIC PMX 6.5.11-7 (2023-12-05T09:44Z) x86_64
The programs included with the Debian GNU/Linux system are free software;
the exact distribution terms for each program are described in the
individual files in /usr/share/doc/*/copyright.
Debian GNU/Linux comes with ABSOLUTELY NO WARRANTY, to the extent
permitted by applicable law.
Last login: Thu Mar 28 10:47:01 2024 from 192.168.38.120
root@debian12ct2:~# ps aux | grep iperf3
root 863 0.0 0.3 8192 3840 pts/1 S Mar27 0:00 iperf3 -s
root 1238 0.0 0.1 3324 1536 pts/3 S+ 14:44 0:00 grep iperf3
root@debian12ct2:~# ps aux | grep crusader
root 864 0.0 0.2 10500 2996 pts/1 Sl Mar27 0:03 ./crusader serve
root 1240 0.0 0.1 3324 1536 pts/3 S+ 14:44 0:00 grep crusader
root@debian12ct2:~# exit
logout
Connection to 192.168.38.6 closed.
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> .\crusader test 192.168.38.6
Connected to server 192.168.38.6:35481
Latency to server 2.42 ms
Testing download...
Testing upload...
Testing both download and upload...
Warning: Load termination timed out. There may be residual untracked traffic in the background.
Writing data...
Saved raw data as data 2024.03.28 22-45-28.crr
Saved plot as plot 2024.03.28 22-45-28.png
Okay, the issue is really only under OpenWrt.
Steps to reproduce. Initially there are no issues.
OpenWRT side:
root@OpenWrt:~# ./crusader serve &
root@OpenWrt:~# Server running...
root@OpenWrt:~# ps | grep crusader
9484 root 10432 S ./crusader serve
9546 root 1148 R grep crusader
root@OpenWrt:~# Serving 192.168.38.120:3930, version 3
Serving complete for 192.168.38.120:3930
root@OpenWrt:~# exit
Connection to 192.168.38.1 closed.
Windows client side:
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> .\crusader test 192.168.38.1
Connected to server 192.168.38.1:35481
Latency to server 2.76 ms
Testing download...
Testing upload...
Testing both download and upload...
Writing data...
Saved raw data as data 2024.03.29 11-21-47.crr
Saved plot as plot 2024.03.29 11-21-47.png
PS C:\work\speedtest> ssh [email protected]
[email protected]'s password:
BusyBox v1.36.1 (2023-11-14 13:38:11 UTC) built-in shell (ash)
_______ ________ __
| |.-----.-----.-----.| | | |.----.| |_
| - || _ | -__| || | | || _|| _|
|_______|| __|_____|__|__||________||__| |____|
|__| W I R E L E S S F R E E D O M
-----------------------------------------------------
OpenWrt 23.05.2, r23630-842932a63d
-----------------------------------------------------
root@OpenWrt:~# ps | grep crusader
9484 root 10516 S ./crusader serve
9716 root 1148 R grep crusader
Then I ran the test from the Windows client again and it failed. On the OpenWrt side, we can see the server process is gone.
Windows client side
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> .\crusader test 192.168.38.1
Connected to server 192.168.38.1:35481
thread 'tokio-runtime-worker' panicked at crusader-lib\src\test.rs:791:71:
called `Result::unwrap()` on an `Err` value: "Expected object"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
root@OpenWrt:~# ps | grep crusader
9756 root 1148 S grep crusader
Debug log:
PS C:\work\speedtest\crusader-x86_64-pc-windows-msvc> .\crusader test 192.168.38.1
Connected to server 192.168.38.1:35481
thread 'tokio-runtime-worker' panicked at crusader-lib\src\test.rs:791:71:
called `Result::unwrap()` on an `Err` value: "Expected object"
stack backtrace:
0: 0x7ff7a538ec83 - <unknown>
1: 0x7ff7a53a9eed - <unknown>
2: 0x7ff7a538b7d1 - <unknown>
3: 0x7ff7a538ea8a - <unknown>
4: 0x7ff7a5390b79 - <unknown>
5: 0x7ff7a539083b - <unknown>
6: 0x7ff7a5391064 - <unknown>
7: 0x7ff7a5390f35 - <unknown>
8: 0x7ff7a538f319 - <unknown>
9: 0x7ff7a5390c44 - <unknown>
10: 0x7ff7a53d2277 - <unknown>
11: 0x7ff7a53d2733 - <unknown>
12: 0x7ff7a52a64e4 - <unknown>
13: 0x7ff7a52d66ad - <unknown>
14: 0x7ff7a5330531 - <unknown>
15: 0x7ff7a532faee - <unknown>
16: 0x7ff7a532f169 - <unknown>
17: 0x7ff7a532b6ed - <unknown>
18: 0x7ff7a5333941 - <unknown>
19: 0x7ff7a5325adb - <unknown>
20: 0x7ff7a531eaef - <unknown>
21: 0x7ff7a531ef75 - <unknown>
22: 0x7ff7a539592c - <unknown>
23: 0x7ff842eb257d - BaseThreadInitThunk
24: 0x7ff844d8aa48 - RtlUserThreadStart
@Zoxc
I have changed the title to reflect the issue I can consistently reproduce. I am using an OpenWrt 23.05 x86_64 VM on Proxmox PVE 8.0.
This seems to be similar to this resolved issue:
- https://github.com/Zoxc/crusader/issues/6
@richb-hanover Just wondering if you can test the server on OpenWRT and see if you can reproduce the issue or not. Thanks.
Steps to reproduce: the crusader server must be run in the background, and you must exit the OpenWrt login session once. Before the exit, everything works fine. After the exit, the first run fails and the crusader server process exits, leaving no traces on the OpenWrt (server) side.
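Since the server only dies after the login session ends, one possibility is that it is receiving the HUP signal the shell sends to background jobs at logout. As a hedged workaround sketch (an assumption, not verified on this setup), detaching the process with BusyBox's `nohup` might keep it alive across logouts; the runnable stand-in below uses `sleep 0` in place of `./crusader serve`:

```shell
# Possible workaround (assumption: the server is killed by SIGHUP at
# logout): detach it from the terminal with nohup, e.g.
#   nohup ./crusader serve >/tmp/crusader.log 2>&1 &
# The runnable stand-in below replaces the server with "sleep 0".
nohup sleep 0 >/tmp/crusader.log 2>&1 &
wait $!
echo "stand-in exited with status $?"
```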
PS C:\work\speedtest> ssh [email protected]
[email protected]'s password:
BusyBox v1.36.1 (2023-11-14 13:38:11 UTC) built-in shell (ash)
_______ ________ __
| |.-----.-----.-----.| | | |.----.| |_
| - || _ | -__| || | | || _|| _|
|_______|| __|_____|__|__||________||__| |____|
|__| W I R E L E S S F R E E D O M
-----------------------------------------------------
OpenWrt 23.05.2, r23630-842932a63d
-----------------------------------------------------
root@OpenWrt:~# ./crusader serve &
root@OpenWrt:~# Server running...
Serving 192.168.38.120:4422, version 3
Serving complete for 192.168.38.120:4422
Serving 192.168.38.120:4491, version 3
Serving complete for 192.168.38.120:4491
root@OpenWrt:~# Serving 192.168.38.120:4624, version 3
Serving complete for 192.168.38.120:4624
root@OpenWrt:~# exit
Connection to 192.168.38.1 closed.
PS C:\work\speedtest> ssh [email protected]
[email protected]'s password:
BusyBox v1.36.1 (2023-11-14 13:38:11 UTC) built-in shell (ash)
_______ ________ __
| |.-----.-----.-----.| | | |.----.| |_
| - || _ | -__| || | | || _|| _|
|_______|| __|_____|__|__||________||__| |____|
|__| W I R E L E S S F R E E D O M
-----------------------------------------------------
OpenWrt 23.05.2, r23630-842932a63d
-----------------------------------------------------
root@OpenWrt:~# ps | grep crusader
12012 root 10568 S ./crusader serve
12439 root 1148 R grep crusader
(note: client crashed)
root@OpenWrt:~# ps | grep crusader
12500 root 1148 R grep crusader
root@OpenWrt:~# exit
Connection to 192.168.38.1 closed.
Detailed client-side debug log from a debug build of crusader.
PS C:\work\speedtest\crusader\src\target\debug> .\crusader test 192.168.38.1
Connected to server 192.168.38.1:35481
thread 'tokio-runtime-worker' panicked at 'called `Result::unwrap()` on an `Err` value: "Expected object"', crusader-lib\src\test.rs:791:71
stack backtrace:
0: 0x7ff6b785fa22 - std::backtrace_rs::backtrace::dbghelp::trace
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\..\..\backtrace\src\backtrace\dbghelp.rs:98
1: 0x7ff6b785fa22 - std::backtrace_rs::backtrace::trace_unsynchronized
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\..\..\backtrace\src\backtrace\mod.rs:66
2: 0x7ff6b785fa22 - std::sys_common::backtrace::_print_fmt
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\sys_common\backtrace.rs:65
3: 0x7ff6b785fa22 - std::sys_common::backtrace::_print::impl$0::fmt
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\sys_common\backtrace.rs:44
4: 0x7ff6b787bddb - core::fmt::write
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\core\src\fmt\mod.rs:1232
5: 0x7ff6b785b85a - std::io::Write::write_fmt<std::sys::windows::stdio::Stderr>
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\io\mod.rs:1684
6: 0x7ff6b785f76b - std::sys_common::backtrace::_print
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\sys_common\backtrace.rs:47
7: 0x7ff6b785f76b - std::sys_common::backtrace::print
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\sys_common\backtrace.rs:34
8: 0x7ff6b7861d79 - std::panicking::default_hook::closure$1
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\panicking.rs:271
9: 0x7ff6b78619fb - std::panicking::default_hook
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\panicking.rs:290
10: 0x7ff6b78624a8 - std::panicking::rust_panic_with_hook
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\panicking.rs:692
11: 0x7ff6b786239e - std::panicking::begin_panic_handler::closure$0
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\panicking.rs:583
12: 0x7ff6b7860409 - std::sys_common::backtrace::__rust_end_short_backtrace<std::panicking::begin_panic_handler::closure_env$0,never$>
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\sys_common\backtrace.rs:150
13: 0x7ff6b7862050 - std::panicking::begin_panic_handler
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\panicking.rs:579
14: 0x7ff6b7887705 - core::panicking::panic_fmt
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\core\src\panicking.rs:64
15: 0x7ff6b7887c26 - core::result::unwrap_failed
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\core\src\result.rs:1750
16: 0x7ff6b732d4ab - enum2$<core::result::Result<enum2$<crusader_lib::protocol::ServerMessage>,alloc::boxed::Box<dyn$<core::error::Error>,alloc::alloc::Global> > >::unwrap<enum2$<crusader_lib::protocol::ServerMessage>,alloc::boxed::Box<dyn$<core::error::Error>,alloc::alloc::G
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\core\src\result.rs:1090
17: 0x7ff6b73d848b - crusader_lib::test::measure_latency::async_fn$0::async_block$0
at C:\work\speedtest\crusader\src\crusader-lib\src\test.rs:791
18: 0x7ff6b74b1ed4 - tokio::runtime::task::core::impl$3::poll::closure$0<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\core.rs:165
19: 0x7ff6b740f3a7 - tokio::loom::std::unsafe_cell::UnsafeCell<enum2$<tokio::runtime::task::core::Stage<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0> > > >::with_mut<enum2$<tokio::runtime::task::core::Stage<enum2$<crusader_lib::test::measure_laten
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\loom\std\unsafe_cell.rs:14
20: 0x7ff6b74b1384 - tokio::runtime::task::core::CoreStage<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0> >::poll<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\core.rs:155
21: 0x7ff6b73b2d80 - tokio::runtime::task::harness::poll_future::closure$0<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0>,alloc::sync::Arc<tokio::runtime::thread_pool::worker::Shared> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\harness.rs:480
22: 0x7ff6b74f0f6b - core::panic::unwind_safe::impl$23::call_once<enum2$<core::task::poll::Poll<tuple$<alloc::vec::Vec<crusader_lib::protocol::LatencyMeasure,alloc::alloc::Global>,tokio_util::codec::framed_read::FramedRead<tokio::net::tcp::split_owned::OwnedReadHalf,tokio_uti
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\core\src\panic\unwind_safe.rs:271
23: 0x7ff6b74e30c8 - std::panicking::try::do_call<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::closure_env$0<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0>,alloc::sync::Arc<tokio::runtime::thread_pool::work
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panicking.rs:487
24: 0x7ff6b74da762 - std::panicking::try<enum2$<core::task::poll::Poll<tuple$<alloc::vec::Vec<crusader_lib::protocol::LatencyMeasure,alloc::alloc::Global>,tokio_util::codec::framed_read::FramedRead<tokio::net::tcp::split_owned::OwnedReadHalf,tokio_util::codec::length_delimite
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panicking.rs:451
25: 0x7ff6b74a762b - std::panic::catch_unwind<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::closure_env$0<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0>,alloc::sync::Arc<tokio::runtime::thread_pool::worker::
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panic.rs:140
26: 0x7ff6b73ae90d - tokio::runtime::task::harness::poll_future<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0>,alloc::sync::Arc<tokio::runtime::thread_pool::worker::Shared> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\harness.rs:468
27: 0x7ff6b73b6e7f - tokio::runtime::task::harness::Harness<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0>,alloc::sync::Arc<tokio::runtime::thread_pool::worker::Shared> >::poll_inner<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_blo
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\harness.rs:104
28: 0x7ff6b73c1e83 - tokio::runtime::task::harness::Harness<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0>,alloc::sync::Arc<tokio::runtime::thread_pool::worker::Shared> >::poll<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\harness.rs:57
29: 0x7ff6b74ea81b - tokio::runtime::task::raw::poll<enum2$<crusader_lib::test::measure_latency::async_fn$0::async_block_env$0>,alloc::sync::Arc<tokio::runtime::thread_pool::worker::Shared> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\raw.rs:144
30: 0x7ff6b7678b66 - tokio::runtime::task::raw::RawTask::poll
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\raw.rs:84
31: 0x7ff6b7663682 - tokio::runtime::task::LocalNotified<alloc::sync::Arc<tokio::runtime::thread_pool::worker::Shared> >::run<alloc::sync::Arc<tokio::runtime::thread_pool::worker::Shared> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\mod.rs:381
32: 0x7ff6b764c118 - tokio::runtime::thread_pool::worker::impl$1::run_task::closure$0
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\thread_pool\worker.rs:435
33: 0x7ff6b765aaa9 - tokio::coop::with_budget::closure$0<enum2$<core::result::Result<alloc::boxed::Box<tokio::runtime::thread_pool::worker::Core,alloc::alloc::Global>,tuple$<> > >,tokio::runtime::thread_pool::worker::impl$1::run_task::closure_env$0>
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\coop.rs:102
34: 0x7ff6b7693b5c - std::thread::local::LocalKey<core::cell::Cell<tokio::coop::Budget> >::try_with<core::cell::Cell<tokio::coop::Budget>,tokio::coop::with_budget::closure_env$0<enum2$<core::result::Result<alloc::boxed::Box<tokio::runtime::thread_pool::worker::Core,alloc::all
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\thread\local.rs:446
35: 0x7ff6b769318e - std::thread::local::LocalKey<core::cell::Cell<tokio::coop::Budget> >::with<core::cell::Cell<tokio::coop::Budget>,tokio::coop::with_budget::closure_env$0<enum2$<core::result::Result<alloc::boxed::Box<tokio::runtime::thread_pool::worker::Core,alloc::alloc::
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\thread\local.rs:422
36: 0x7ff6b764c0e3 - tokio::coop::with_budget
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\coop.rs:95
37: 0x7ff6b764c0e3 - tokio::coop::budget
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\coop.rs:72
38: 0x7ff6b764c0e3 - tokio::runtime::thread_pool::worker::Context::run_task
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\thread_pool\worker.rs:434
39: 0x7ff6b764bde4 - tokio::runtime::thread_pool::worker::Context::run
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\thread_pool\worker.rs:401
40: 0x7ff6b764bc8b - tokio::runtime::thread_pool::worker::run::closure$0
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\thread_pool\worker.rs:386
41: 0x7ff6b765ecc4 - tokio::macros::scoped_tls::ScopedKey<tokio::runtime::thread_pool::worker::Context>::set<tokio::runtime::thread_pool::worker::Context,tokio::runtime::thread_pool::worker::run::closure_env$0,tuple$<> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\macros\scoped_tls.rs:61
42: 0x7ff6b764bc43 - tokio::runtime::thread_pool::worker::run
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\thread_pool\worker.rs:383
43: 0x7ff6b764bb6e - tokio::runtime::thread_pool::worker::impl$0::launch::closure$0
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\thread_pool\worker.rs:362
44: 0x7ff6b768243a - tokio::runtime::blocking::task::impl$2::poll<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0,tuple$<> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\blocking\task.rs:42
45: 0x7ff6b7640d3c - tokio::runtime::task::core::impl$3::poll::closure$0<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0> >
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\core.rs:165
46: 0x7ff6b768001f - tokio::loom::std::unsafe_cell::UnsafeCell<enum2$<tokio::runtime::task::core::Stage<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0> > > >::with_mut<enum2$<tokio::runtime::task::core::Stage<to
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\loom\std\unsafe_cell.rs:14
47: 0x7ff6b7640c47 - tokio::runtime::task::core::CoreStage<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0> >::poll<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\core.rs:155
48: 0x7ff6b7692fd8 - tokio::runtime::task::harness::poll_future::closure$0<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0>,tokio::runtime::blocking::schedule::NoopSchedule>
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\harness.rs:480
49: 0x7ff6b7683ca3 - core::panic::unwind_safe::impl$23::call_once<enum2$<core::task::poll::Poll<tuple$<> > >,tokio::runtime::task::harness::poll_future::closure_env$0<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\core\src\panic\unwind_safe.rs:271
50: 0x7ff6b76aa769 - std::panicking::try::do_call<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::closure_env$0<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0>,tokio::runti
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panicking.rs:487
51: 0x7ff6b76aa43d - std::panicking::try<enum2$<core::task::poll::Poll<tuple$<> > >,core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::closure_env$0<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::l
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panicking.rs:451
52: 0x7ff6b766c5db - std::panic::catch_unwind<core::panic::unwind_safe::AssertUnwindSafe<tokio::runtime::task::harness::poll_future::closure_env$0<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0>,tokio::runtime::
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panic.rs:140
53: 0x7ff6b7692b0b - tokio::runtime::task::harness::poll_future<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0>,tokio::runtime::blocking::schedule::NoopSchedule>
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\harness.rs:468
54: 0x7ff6b76914d6 - tokio::runtime::task::harness::Harness<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0>,tokio::runtime::blocking::schedule::NoopSchedule>::poll_inner<tokio::runtime::blocking::task::BlockingT
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\harness.rs:104
55: 0x7ff6b7691373 - tokio::runtime::task::harness::Harness<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0>,tokio::runtime::blocking::schedule::NoopSchedule>::poll<tokio::runtime::blocking::task::BlockingTask<to
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\harness.rs:57
56: 0x7ff6b7678c9b - tokio::runtime::task::raw::poll<tokio::runtime::blocking::task::BlockingTask<tokio::runtime::thread_pool::worker::impl$0::launch::closure_env$0>,tokio::runtime::blocking::schedule::NoopSchedule>
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\raw.rs:144
57: 0x7ff6b7678b66 - tokio::runtime::task::raw::RawTask::poll
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\raw.rs:84
58: 0x7ff6b76636e7 - tokio::runtime::task::UnownedTask<tokio::runtime::blocking::schedule::NoopSchedule>::run<tokio::runtime::blocking::schedule::NoopSchedule>
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\task\mod.rs:418
59: 0x7ff6b7671dc5 - tokio::runtime::blocking::pool::Task::run
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\blocking\pool.rs:91
60: 0x7ff6b7672f47 - tokio::runtime::blocking::pool::Inner::run
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\blocking\pool.rs:325
61: 0x7ff6b7672dbf - tokio::runtime::blocking::pool::impl$4::spawn_thread::closure$0
at C:\Users\xiaof\.cargo\registry\src\github.com-1ecc6299db9ec823\tokio-1.20.1\src\runtime\blocking\pool.rs:300
62: 0x7ff6b765cca9 - std::sys_common::backtrace::__rust_begin_short_backtrace<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\sys_common\backtrace.rs:134
63: 0x7ff6b765cca9 - std::sys_common::backtrace::__rust_begin_short_backtrace<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\sys_common\backtrace.rs:134
64: 0x7ff6b767e081 - std::thread::impl$0::spawn_unchecked_::closure$1::closure$0<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\thread\mod.rs:560
65: 0x7ff6b7683d01 - core::panic::unwind_safe::impl$23::call_once<tuple$<>,std::thread::impl$0::spawn_unchecked_::closure$1::closure_env$0<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> > >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\core\src\panic\unwind_safe.rs:271
66: 0x7ff6b76aad09 - std::panicking::try::do_call<core::panic::unwind_safe::AssertUnwindSafe<std::thread::impl$0::spawn_unchecked_::closure$1::closure_env$0<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> > >,tuple$<> >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panicking.rs:487
67: 0x7ff6b76aa0f6 - std::panicking::try<tuple$<>,core::panic::unwind_safe::AssertUnwindSafe<std::thread::impl$0::spawn_unchecked_::closure$1::closure_env$0<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> > > >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panicking.rs:451
68: 0x7ff6b766c609 - std::panic::catch_unwind<core::panic::unwind_safe::AssertUnwindSafe<std::thread::impl$0::spawn_unchecked_::closure$1::closure_env$0<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> > >,tuple$<> >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\panic.rs:140
69: 0x7ff6b767df84 - std::thread::impl$0::spawn_unchecked_::closure$1<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\std\src\thread\mod.rs:559
70: 0x7ff6b7654e5e - core::ops::function::FnOnce::call_once<std::thread::impl$0::spawn_unchecked_::closure_env$1<tokio::runtime::blocking::pool::impl$4::spawn_thread::closure_env$0,tuple$<> >,tuple$<> >
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc\library\core\src\ops\function.rs:250
71: 0x7ff6b7866f4c - alloc::boxed::impl$45::call_once
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\alloc\src\boxed.rs:1987
72: 0x7ff6b7866f4c - alloc::boxed::impl$45::call_once
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\alloc\src\boxed.rs:1987
73: 0x7ff6b7866f4c - std::sys::windows::thread::impl$0::new::thread_start
at /rustc/84c898d65adf2f39a5a98507f1fe0ce10a2b8dbc/library\std\src\sys\windows\thread.rs:56
74: 0x7ff842eb257d - BaseThreadInitThunk
75: 0x7ff844d8aa48 - RtlUserThreadStart
For the Debian/Ubuntu container, occasionally I can also recreate the issue after a while.
mcuee@mcuees-Mac-mini crusader-aarch64-apple-darwin % ./crusader test 192.168.50.15
Connected to server 192.168.50.15:35481
thread 'main' panicked at crusader-lib/src/test.rs:1318:10:
called `Result::unwrap()` on an `Err` value: "Expected object"
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
zsh: abort ./crusader test 192.168.50.15
Once this happens, the server process will exit.
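Until the crash itself is fixed, a stopgap on the server side could be a small supervision loop that restarts the process whenever it exits. This is only a sketch: `restart_loop` is a hypothetical helper (not part of crusader), `sleep 0` stands in for `./crusader serve`, and the loop is bounded so the example terminates:

```shell
# Stopgap sketch: restart a command when it exits (bounded to "max"
# iterations so this example terminates). In real use, "cmd" would be
# "./crusader serve" and the loop would run unbounded.
restart_loop() {
    cmd=$1
    max=$2
    n=0
    while [ "$n" -lt "$max" ]; do
        $cmd                    # run the (placeholder) server
        status=$?               # record its exit status
        n=$((n+1))
        echo "run $n/$max exited with status $status" >&2
    done
}

restart_loop "sleep 0" 3
```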
@richb-hanover Just wondering if you can test the server on OpenWRT and see if you can reproduce the issue or not. Thanks.
Sorry, I don't have a platform for testing crusader server on OpenWrt. Hopefully, you'll get clues from the Debian/Ubuntu container...
This should be re-tested with 0.3, although I'm convinced that running the Crusader server on all but the most powerful OpenWrt routers will give bad results, because the router isn't set up to both route and source all the traffic without latency or degraded performance.
I am mainly using a low-power x86_64 mini-PC-based OpenWrt virtual router (Intel N100 CPU) on Proxmox PVE 8.x; it is powerful enough to run crusader. I will test this issue again over the weekend.
I usually put another AP behind it (right now either a ZTE BE7200 Pro+ or an Asus TUF-BE6500, both dual-band WiFi 7 routers).