MPRIS interface unresponsive (possibly due to credential cache panic)
Description
The MPRIS interface on the system bus becomes unresponsive after some idle time overnight (the instance is still visible but not introspectable):
busctl introspect org.mpris.MediaPlayer2.spotifyd.instance365 /org/mpris/MediaPlayer2
Failed to introspect object /org/mpris/MediaPlayer2 of service org.mpris.MediaPlayer2.spotifyd.instance365: Connection timed out
There is an error about credential caching in the log which I do not understand.
Then I tried this:
busctl call org.mpris.MediaPlayer2.spotifyd.instance365 /rs/spotifyd/Controls rs.spotifyd.Controls TransferPlayback
Call failed: Connection timed out
After that, spotifyd crashed and was restarted by my systemd configuration; it then worked properly again.
To Reproduce
- start spotifyd fresh with MPRIS on the system bus
- play music with the Android Spotify app and select spotifyd as the device
- pause the music and wait 8 hours without letting anything else connect to Spotify
- run the first busctl introspect command for your instance; it times out (three attempts in my case)
- run the second busctl command; it times out again
- check the logs; in my case spotifyd had crashed and then started working again
Expected behavior
MPRIS should be introspectable and responsive at all times. If there is a Spotify API login problem, spotifyd should automatically try to log in again.
Logs
Dec 13 09:59:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:01:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:03:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:05:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:06:34 DietPi spotifyd[365]: Token cache write ignored (not configured)
Dec 13 10:06:34 DietPi spotifyd[365]: The application panicked (crashed).
Dec 13 10:06:34 DietPi spotifyd[365]: Message: RSpotify not authenticated
Dec 13 10:06:34 DietPi spotifyd[365]: Location: /root/.cargo/registry/src/index.crates.io-1cd66030c949c28d/rspotify-0.12.0/src/clients/base.rs:106
Dec 13 10:06:34 DietPi spotifyd[365]: Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Dec 13 10:06:34 DietPi spotifyd[365]: Run with RUST_BACKTRACE=full to include source snippets.
Dec 13 10:07:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:09:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:11:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:13:50 DietPi spotifyd[365]: Session[0] strong=4 weak=2
Dec 13 10:15:14 DietPi spotifyd[365]: kMessageTypePlay "" webapi-44219fb334174bc6b2c634d8f9e4f6eb 1651865768 1702457260546 kPlayStatusPause
Dec 13 10:15:14 DietPi spotifyd[365]: Sending status to server: [kPlayStatusStop]
Dec 13 10:15:14 DietPi spotifyd[365]: The application panicked (crashed).
Dec 13 10:15:14 DietPi spotifyd[365]: Message: called `Result::unwrap()` on an `Err` value: PoisonError { .. }
Dec 13 10:15:14 DietPi spotifyd[365]: Location: src/dbus_mpris.rs:84
Dec 13 10:15:14 DietPi spotifyd[365]: Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
Dec 13 10:15:14 DietPi spotifyd[365]: Run with RUST_BACKTRACE=full to include source snippets.
Dec 13 10:15:14 DietPi spotifyd[365]: drop Spirc[0]
Dec 13 10:15:14 DietPi spotifyd[365]: Shutting down player thread ...
Dec 13 10:15:14 DietPi spotifyd[365]: drop PlayerInternal[0]
Dec 13 10:15:14 DietPi spotifyd[365]: PlayerInternal thread finished.
Dec 13 10:15:14 DietPi spotifyd[365]: drop Session[0]
Dec 13 10:15:14 DietPi spotifyd[365]: drop MercuryManager
Dec 13 10:15:14 DietPi spotifyd[365]: drop Dispatch
Dec 13 10:15:14 DietPi systemd[1]: spotifyd.service: Main process exited, code=exited, status=101/n/a
Dec 13 10:15:14 DietPi systemd[1]: spotifyd.service: Failed with result 'exit-code'.
Dec 13 10:15:19 DietPi systemd[1]: spotifyd.service: Scheduled restart job, restart counter is at 1.
Dec 13 10:15:19 DietPi systemd[1]: Stopped Spotifyd (DietPi).
Dec 13 10:15:19 DietPi systemd[1]: Started Spotifyd (DietPi).
Dec 13 10:15:19 DietPi spotifyd[978]: Loading config from "/mnt/dietpi_userdata/spotifyd/spotifyd.conf"
Dec 13 10:15:19 DietPi spotifyd[978]: CliConfig { config_path: Some("/mnt/dietpi_userdata/spotifyd/spotifyd.conf"), no_daemon: true, verbose: true, pid: None, shared_config: SharedConfigValues { username: Some("taken out for privacy"), username_cmd: None, password: None, password_cmd: None, use_keyring: false, use_mpris: Some(true), dbus_type: Some(System), on_song_change_hook: None, cache_path: Some("/mnt/dietpi_userdata/spotifyd/cache"), no-audio-cache: true, backend: Some(Alsa), volume_controller: Some(SoftVolume), device: None, control: None, mixer: None, device_name: Some("DietPi"), bitrate: Some(Bitrate320), audio_format: None, initial_volume: Some("40"), volume_normalisation: false, normalisation_pregain: None, zeroconf_port: None, proxy: None, device_type: Some(Speaker), autoplay: true, max_cache_size: None } }
Dec 13 10:15:19 DietPi spotifyd[978]: Found user shell: Some("/bin/sh")
Dec 13 10:15:19 DietPi spotifyd[978]: No password specified. Checking password_cmd
Dec 13 10:15:19 DietPi spotifyd[978]: No password_cmd specified
Dec 13 10:15:19 DietPi spotifyd[978]: No proxy specified
Dec 13 10:15:19 DietPi spotifyd[978]: Using software volume controller.
Dec 13 10:15:19 DietPi spotifyd[978]: Ignoring blacklisted access point ap-gew4.spotify.com:4070
Dec 13 10:15:19 DietPi spotifyd[978]: Ignoring blacklisted access point ap-gew4.spotify.com:443
Dec 13 10:15:19 DietPi spotifyd[978]: Ignoring blacklisted access point ap-gew4.spotify.com:80
Dec 13 10:15:19 DietPi spotifyd[978]: Ignoring blacklisted access point ap-gue1.spotify.com:443
Dec 13 10:15:19 DietPi spotifyd[978]: Failed to resolve Access Point: Unable to resolve any viable access points.
Dec 13 10:15:19 DietPi spotifyd[978]: Using fallback "ap.spotify.com:443"
Dec 13 10:15:19 DietPi spotifyd[978]: Connecting to AP "ap.spotify.com:443"
Dec 13 10:15:20 DietPi spotifyd[978]: Authenticated as "xxxxx" !
Dec 13 10:15:20 DietPi spotifyd[978]: new Session[0]
Dec 13 10:15:20 DietPi spotifyd[978]: Session[0] strong=2 weak=1
Compilation flags
- [x] dbus_mpris
- [ ] dbus_keyring
- [x] alsa_backend
- [ ] portaudio_backend
- [ ] pulseaudio_backend
- [ ] rodio_backend
Versions (please complete the following information):
- OS: DietPi 8.0.2 (raspbian bullseye armv6 image) on RaspberryPi 1B
- Spotifyd: 5565f24 armv6 (arm-unknown-linux-gnueabihf version of rust on armv6 raspbian image)
- cargo: cargo 1.74.1 (ecb9851af 2023-10-18)
I see the same issue. I started playback using the TransferPlayback D-Bus method, and after some time (much less than 8 hours) the log reported a panic, but the process was still alive so it didn't get restarted. It didn't crash again when I ran introspect and TransferPlayback, but as soon as I opened the Spotify app on another machine I got the second panic and the process died.
Here it is with the backtrace enabled:
Backtrace
Couldn't fetch metadata from spotify: Nothing playing at the moment.
Token cache write ignored (not configured)
The application panicked (crashed).
Message: RSpotify not authenticated
Location: /home/daniel/.cargo/registry/src/index.crates.io-6f17d22bba15001f/rspotify-0.12.0/src/clients/base.rs:106
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ BACKTRACE ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
⋮ 8 frames hidden ⋮
9: core::panicking::panic_display::h2c6e530b26584522
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/core/src/panicking.rs:196
10: core::panicking::panic_str::h4c43cc2df5473fd1
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/core/src/panicking.rs:171
11: core::option::expect_failed::h39c56eaa3bce6881
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/core/src/option.rs:1988
12: rspotify::clients::base::BaseClient::auth_headers::hadc1668e0558c207
at <unknown source file>:<unknown line>
13: rspotify::clients::base::BaseClient::api_get::h9b23681b5b7dd344
at <unknown source file>:<unknown line>
14: spotifyd::dbus_mpris::get_device_id::h325050474d62ce30
at <unknown source file>:<unknown line>
15: dbus_crossroads::ifacedesc::IfaceBuilder<T>::method_with_cr::{{closure}}::ha001171c8a86d58c
at <unknown source file>:<unknown line>
16: dbus_crossroads::crossroads::Crossroads::handle_message_inner::hdd316cc5c3a84b5e
at <unknown source file>:<unknown line>
17: dbus_crossroads::crossroads::Crossroads::handle_message::hda6f3896ef4ed02f
at <unknown source file>:<unknown line>
18: spotifyd::dbus_mpris::create_dbus_server::{{closure}}::{{closure}}::h2f08b8943cbbdcd2
at <unknown source file>:<unknown line>
19: <dbus::nonblock::SyncConnection as dbus::nonblock::Process>::process_one::h1c8860bd6188ec2d
at <unknown source file>:<unknown line>
20: <dbus_tokio::connection::IOResource<C> as core::future::future::Future>::poll::h58d27e434e2502c6
at <unknown source file>:<unknown line>
21: spotifyd::dbus_mpris::create_dbus_server::{{closure}}::{{closure}}::h88aa61c939a0c92d
at <unknown source file>:<unknown line>
22: tokio::runtime::task::core::Core<T,S>::poll::h2e2c8cdcb2917e71
at <unknown source file>:<unknown line>
23: tokio::runtime::task::harness::Harness<T,S>::poll::h9fec6ef6669f3748
at <unknown source file>:<unknown line>
24: tokio::runtime::scheduler::multi_thread::worker::Context::run_task::h866b916a77bae828
at <unknown source file>:<unknown line>
25: tokio::runtime::scheduler::multi_thread::worker::Context::run::h08ee4a2fe6d3662a
at <unknown source file>:<unknown line>
26: tokio::runtime::context::scoped::Scoped<T>::set::h947f25f0a8746071
at <unknown source file>:<unknown line>
27: tokio::runtime::context::runtime::enter_runtime::h825bec3258685d62
at <unknown source file>:<unknown line>
28: tokio::runtime::scheduler::multi_thread::worker::run::h5c562ff06d9e8e2c
at <unknown source file>:<unknown line>
29: <tokio::runtime::blocking::task::BlockingTask<T> as core::future::future::Future>::poll::hba477662a9b8bfe5
at <unknown source file>:<unknown line>
30: tokio::runtime::task::core::Core<T,S>::poll::h4ab980a4bdf7aac5
at <unknown source file>:<unknown line>
31: tokio::runtime::task::harness::Harness<T,S>::poll::h2c319639f591823e
at <unknown source file>:<unknown line>
32: tokio::runtime::blocking::pool::Inner::run::hffcdd47c378bf96a
at <unknown source file>:<unknown line>
33: std::sys_common::backtrace::__rust_begin_short_backtrace::ha869a53ca4b5b754
at <unknown source file>:<unknown line>
34: core::ops::function::FnOnce::call_once{{vtable.shim}}::h75b4f91f1a148f7f
at <unknown source file>:<unknown line>
35: <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h65e29fa99a122141
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/alloc/src/boxed.rs:2015
36: <alloc::boxed::Box<F,A> as core::ops::function::FnOnce<Args>>::call_once::h0e96a7fd708b99fb
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/alloc/src/boxed.rs:2015
37: std::sys::pal::unix::thread::Thread::new::thread_start::h1a4b48df802ee0a8
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/sys/pal/unix/thread.rs:108
38: start_thread<unknown>
at ./nptl/pthread_create.c:442
39: thread_start<unknown>
at ./misc/../sysdeps/unix/sysv/linux/aarch64/clone.S:79
Run with COLORBT_SHOW_HIDDEN=1 environment variable to disable frame filtering.
The application panicked (crashed).
Message: called `Result::unwrap()` on an `Err` value: PoisonError { .. }
Location: src/dbus_mpris.rs:84
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ BACKTRACE ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
⋮ 9 frames hidden ⋮
10: <spotifyd::dbus_mpris::DbusServer as core::future::future::Future>::poll::h9b68c668beb53c18
at <unknown source file>:<unknown line>
11: <tokio::future::poll_fn::PollFn<F> as core::future::future::Future>::poll::hb787d52c83951b93
at <unknown source file>:<unknown line>
12: tokio::runtime::runtime::Runtime::block_on::h5c2d6b3a3959e000
at <unknown source file>:<unknown line>
13: spotifyd::main::h9bb04d2e3cab8d76
at <unknown source file>:<unknown line>
14: std::sys_common::backtrace::__rust_begin_short_backtrace::h50466c551e020831
at <unknown source file>:<unknown line>
15: std::rt::lang_start::{{closure}}::hf00986dc282a29a9
at <unknown source file>:<unknown line>
16: core::ops::function::impls::<impl core::ops::function::FnOnce<A> for &F>::call_once::hdbbf4bc5b948549a
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/core/src/ops/function.rs:284
17: std::panicking::try::do_call::hd9aa31d1e8310f6a
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panicking.rs:554
18: std::panicking::try::h5b4be1ce5148e504
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panicking.rs:518
19: std::panic::catch_unwind::h66d85bc43d2d9f6e
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panic.rs:142
20: std::rt::lang_start_internal::{{closure}}::h2b8a5b503f27f998
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/rt.rs:148
21: std::panicking::try::do_call::hd7f77e71a4e73fa4
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panicking.rs:554
22: std::panicking::try::h63f06620cf880361
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panicking.rs:518
23: std::panic::catch_unwind::h4b03c83abc824798
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/panic.rs:142
24: std::rt::lang_start_internal::h0ddfe42b2029814a
at /rustc/25ef9e3d85d934b27d9dada2f9dd52b1dc63bb04/library/std/src/rt.rs:148
25: main<unknown>
at <unknown source file>:<unknown line>
26: __libc_start_call_main<unknown>
at ./csu/../sysdeps/nptl/libc_start_call_main.h:58
27: __libc_start_main_impl<unknown>
at ./csu/../csu/libc-start.c:360
Hmm, I think I see the problem in the current code: when the application is idle, we currently don't refresh the token, so the D-Bus calls end up using an invalid token.
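A minimal sketch of that idea, not spotifyd's actual code (`CachedToken`, `WebApiClient`, and `refresh` are hypothetical stand-ins for the rspotify token handling): check the token's expiry before each web-API call that D-Bus triggers, and refresh it lazily instead of relying only on the token obtained at startup.

```rust
use std::time::{Duration, Instant};

// Hypothetical stand-ins for the rspotify token handling in spotifyd.
struct CachedToken {
    value: String,
    expires_at: Instant,
}

struct WebApiClient {
    token: Option<CachedToken>,
}

impl WebApiClient {
    // Placeholder for the real token request / OAuth refresh call.
    fn refresh(&mut self) -> Result<(), String> {
        self.token = Some(CachedToken {
            value: "new-access-token".to_string(),
            expires_at: Instant::now() + Duration::from_secs(3600),
        });
        Ok(())
    }

    // Ensure a valid token exists before the web-API call that currently
    // panics with "RSpotify not authenticated" after the daemon has idled.
    fn ensure_token(&mut self) -> Result<&str, String> {
        let expired = self
            .token
            .as_ref()
            .map_or(true, |t| Instant::now() >= t.expires_at);
        if expired {
            self.refresh()?;
        }
        self.token
            .as_ref()
            .map(|t| t.value.as_str())
            .ok_or_else(|| "token refresh produced no token".to_string())
    }
}

fn main() {
    let mut client = WebApiClient { token: None };
    // First call refreshes lazily; later calls reuse the cached token
    // until it expires.
    println!("token: {}", client.ensure_token().unwrap());
}
```

With a check like this, an idle daemon would re-authenticate on the next MPRIS call instead of hitting the "RSpotify not authenticated" panic.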
We are seeing the same thing with our recent build, when MPRIS is polled but spotifyd has been idle for a while.
amplipi authbind[3093]: The application panicked (crashed).
May 06 15:09:59 amplipi authbind[3093]: Message: called `Result::unwrap()` on an `Err` value: PoisonError { .. }
May 06 15:09:59 amplipi authbind[3093]: Location: src/dbus_mpris.rs:84
May 06 15:09:59 amplipi authbind[3093]: Backtrace omitted. Run with RUST_BACKTRACE=1 environment variable to display it.
May 06 15:09:59 amplipi authbind[3093]: Run with RUST_BACKTRACE=full to include source snippets.
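For reference, the second panic (`PoisonError { .. }` at src/dbus_mpris.rs:84) looks like a follow-on effect: `PoisonError` only comes from a standard-library lock, so once a thread panics while holding that lock, every later `lock().unwrap()` on it panics as well. A self-contained sketch of that mechanism and one way to recover, assuming the shared state sits behind a `std::sync::Mutex` (not spotifyd's actual code):

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    let state = Arc::new(Mutex::new(0_u32));

    // Simulate a panic while the lock is held (like the earlier
    // "RSpotify not authenticated" panic); this poisons the Mutex.
    let poisoner = Arc::clone(&state);
    let _ = thread::spawn(move || {
        let _guard = poisoner.lock().unwrap();
        panic!("simulated panic while holding the lock");
    })
    .join();

    // A plain `state.lock().unwrap()` here would panic with
    // `PoisonError { .. }`, as in the logs above.
    // Recover the guard from the PoisonError instead of unwrapping:
    let guard = state
        .lock()
        .unwrap_or_else(|poisoned| poisoned.into_inner());
    println!("recovered value: {}", *guard);
}
```

Recovering from (or avoiding) the poisoned lock wouldn't fix the underlying token refresh problem, but it could keep the D-Bus server from taking the whole process down after the first panic.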