
Android 11 automatically runs the code in the main function when the phone is rebooted

Open tsunhousam91 opened this issue 2 years ago • 5 comments

Feature proposal

Hi, I found that when I reboot my phone, the log in the main function is printed, which means the main function is called automatically on reboot. However, the behavior I expected is that the main function is only called when the user taps the app's launch icon.

Motivating use case(s)

Some initialisation code in the main function gets called at a time when it should not be, and strange errors may result, so I want to prevent the main function from being called when the phone is rebooted.

audio_service version:

audio_service: ^0.18.1

Device Info

Galaxy M12 SM-M127F/DSN

OS version

Android 11

Flutter Doctor

[✓] Flutter (Channel stable, 2.5.2, on macOS 11.4 20F71 darwin-x64, locale zh-Hant)
[✓] Android toolchain - develop for Android devices (Android SDK version 30.0.2)
[✓] Xcode - develop for iOS and macOS
[✓] Chrome - develop for the web
[✓] Android Studio (version 2020.3)
[✓] Connected device (2 available)

main.dart (minimal reproduction project)

import 'dart:async';

import 'package:audio_service/audio_service.dart';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:just_audio/just_audio.dart';
import 'package:rxdart/rxdart.dart';

// You might want to provide this using dependency injection rather than a
// global variable.
late AudioHandler _audioHandler;

Future<void> main() async {
  print('main is called');
  _audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(),
    config: const AudioServiceConfig(
      androidNotificationChannelId: 'com.ryanheise.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
      androidNotificationOngoing: true,
    ),
  );
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Audio Service Demo',
      theme: ThemeData(primarySwatch: Colors.blue),
      home: MainScreen(),
    );
  }
}

class MainScreen extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('Audio Service Demo'),
      ),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            // Show media item title
            StreamBuilder<MediaItem?>(
              stream: _audioHandler.mediaItem,
              builder: (context, snapshot) {
                final mediaItem = snapshot.data;
                return Text(mediaItem?.title ?? '');
              },
            ),
            // Play/pause/stop buttons.
            StreamBuilder<bool>(
              stream: _audioHandler.playbackState
                  .map((state) => state.playing)
                  .distinct(),
              builder: (context, snapshot) {
                final playing = snapshot.data ?? false;
                return Row(
                  mainAxisAlignment: MainAxisAlignment.center,
                  children: [
                    _button(Icons.fast_rewind, _audioHandler.rewind),
                    if (playing)
                      _button(Icons.pause, _audioHandler.pause)
                    else
                      _button(Icons.play_arrow, _audioHandler.play),
                    _button(Icons.stop, _audioHandler.stop),
                    _button(Icons.fast_forward, _audioHandler.fastForward),
                  ],
                );
              },
            ),
            // A seek bar (omitted in this minimal reproduction).
            StreamBuilder<MediaState>(
              stream: _mediaStateStream,
              builder: (context, snapshot) {
                // The actual seek bar widget is left out of this repro.
                return Container();
              },
            ),
            // Display the processing state.
            StreamBuilder<AudioProcessingState>(
              stream: _audioHandler.playbackState
                  .map((state) => state.processingState)
                  .distinct(),
              builder: (context, snapshot) {
                final processingState =
                    snapshot.data ?? AudioProcessingState.idle;
                return Text(
                    "Processing state: ${describeEnum(processingState)}");
              },
            ),
          ],
        ),
      ),
    );
  }

  /// A stream reporting the combined state of the current media item and its
  /// current position.
  Stream<MediaState> get _mediaStateStream =>
      Rx.combineLatest2<MediaItem?, Duration, MediaState>(
          _audioHandler.mediaItem,
          AudioService.position,
          (mediaItem, position) => MediaState(mediaItem, position));

  IconButton _button(IconData iconData, VoidCallback onPressed) => IconButton(
    icon: Icon(iconData),
    iconSize: 64.0,
    onPressed: onPressed,
  );
}

class MediaState {
  final MediaItem? mediaItem;
  final Duration position;

  MediaState(this.mediaItem, this.position);
}

/// An [AudioHandler] for playing a single item.
class AudioPlayerHandler extends BaseAudioHandler with SeekHandler {
  static final _item = MediaItem(
    id: 'https://s3.amazonaws.com/scifri-episodes/scifri20181123-episode.mp3',
    album: "Science Friday",
    title: "A Salute To Head-Scratching Science",
    artist: "Science Friday and WNYC Studios",
    duration: const Duration(milliseconds: 5739820),
    artUri: Uri.parse(
        'https://media.wnyc.org/i/1400/1400/l/80/1/ScienceFriday_WNYCStudios_1400.jpg'),
  );

  final _player = AudioPlayer();

  /// Initialise our audio handler.
  AudioPlayerHandler() {
    // So that our clients (the Flutter UI and the system notification) know
    // what state to display, here we set up our audio handler to broadcast all
    // playback state changes as they happen via playbackState...
    _player.playbackEventStream.map(_transformEvent).pipe(playbackState);
    // ... and also the current media item via mediaItem.
    mediaItem.add(_item);

    // Load the player.
    _player.setAudioSource(AudioSource.uri(Uri.parse(_item.id)));
  }

  // In this simple example, we handle only 4 actions: play, pause, seek and
  // stop. Any button press from the Flutter UI, notification, lock screen or
  // headset will be routed through to these 4 methods so that you can handle
  // your audio playback logic in one place.

  @override
  Future<void> play() => _player.play();

  @override
  Future<void> pause() => _player.pause();

  @override
  Future<void> seek(Duration position) => _player.seek(position);

  @override
  Future<void> stop() => _player.stop();

  /// Transform a just_audio event into an audio_service state.
  ///
  /// This method is used from the constructor. Every event received from the
  /// just_audio player will be transformed into an audio_service state so that
  /// it can be broadcast to audio_service clients.
  PlaybackState _transformEvent(PlaybackEvent event) {
    return PlaybackState(
      controls: [
        MediaControl.rewind,
        if (_player.playing) MediaControl.pause else MediaControl.play,
        MediaControl.stop,
        MediaControl.fastForward,
      ],
      systemActions: const {
        MediaAction.seek,
        MediaAction.seekForward,
        MediaAction.seekBackward,
      },
      androidCompactActionIndices: const [0, 1, 3],
      processingState: const {
        ProcessingState.idle: AudioProcessingState.idle,
        ProcessingState.loading: AudioProcessingState.loading,
        ProcessingState.buffering: AudioProcessingState.buffering,
        ProcessingState.ready: AudioProcessingState.ready,
        ProcessingState.completed: AudioProcessingState.completed,
      }[_player.processingState]!,
      playing: _player.playing,
      updatePosition: _player.position,
      bufferedPosition: _player.bufferedPosition,
      speed: _player.speed,
      queueIndex: event.currentIndex,
    );
  }
}

AndroidManifest.xml (minimal reproduction project)

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.ttesttt.testtttt33334">
    <uses-permission android:name="android.permission.WAKE_LOCK"/>
    <uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>
   <application
        android:label="testtttt33334"
        android:icon="@mipmap/ic_launcher">
        <activity
            android:name="com.ryanheise.audioservice.AudioServiceActivity"
            android:launchMode="singleTop"
            android:theme="@style/LaunchTheme"
            android:configChanges="orientation|keyboardHidden|keyboard|screenSize|smallestScreenSize|locale|layoutDirection|fontScale|screenLayout|density|uiMode"
            android:hardwareAccelerated="true"
            android:windowSoftInputMode="adjustResize">
            <!-- Specifies an Android theme to apply to this Activity as soon as
                 the Android process has started. This theme is visible to the user
                 while the Flutter UI initializes. After that, this theme continues
                 to determine the Window background behind the Flutter UI. -->
            <meta-data
              android:name="io.flutter.embedding.android.NormalTheme"
              android:resource="@style/NormalTheme"
              />
            <!-- Displays an Android View that continues showing the launch screen
                 Drawable until Flutter paints its first frame, then this splash
                 screen fades out. A splash screen is useful to avoid any visual
                 gap between the end of Android's launch screen and the painting of
                 Flutter's first frame. -->
            <meta-data
              android:name="io.flutter.embedding.android.SplashScreenDrawable"
              android:resource="@drawable/launch_background"
              />
            <intent-filter>
                <action android:name="android.intent.action.MAIN"/>
                <category android:name="android.intent.category.LAUNCHER"/>
            </intent-filter>
        </activity>
        <!-- Don't delete the meta-data below.
             This is used by the Flutter tool to generate GeneratedPluginRegistrant.java -->
        <meta-data
            android:name="flutterEmbedding"
            android:value="2" />

       <!-- ADD THIS "SERVICE" element -->
       <service android:name="com.ryanheise.audioservice.AudioService">
           <intent-filter>
               <action android:name="android.media.browse.MediaBrowserService" />
           </intent-filter>
       </service>

       <!-- ADD THIS "RECEIVER" element -->
       <receiver android:name="com.ryanheise.audioservice.MediaButtonReceiver" >
           <intent-filter>
               <action android:name="android.intent.action.MEDIA_BUTTON" />
           </intent-filter>
       </receiver>
   </application>
</manifest>

Reproduction steps:

  1. Install the app on an Android 11 device.
  2. Launch the app for the first time, then close it.
  3. Reboot the phone.
  4. Watch Logcat in Android Studio: the log "main is called" is printed once the reboot completes (you need to unlock the screen and reach the home screen).

tsunhousam91 · Nov 29 '21

This behaviour does warrant better documentation, but the main method needs to be written in a way that delays any unwanted initialisation until the app actually enters the foreground. The main method does need to run to launch the process, in order to support Android 11's media session resumption feature.

Regarding the repro case, I would prefer you not to copy and paste entire programs into the issue body, hence the issue form asks for a link to a cloneable git repo that includes the complete set of files I need to run it.

If the official example already reproduces the issue, I would prefer to base the investigation on that.

ryanheise · Nov 29 '21

> This behaviour does warrant better documentation, but a main method needs to be written in a way that can delay any unwanted initialisation until the app actually enters the foreground state. But the main method does need to be run to launch the process in order to support Android 11's media session resumption feature.
>
> Regarding the repro case, I would prefer you not to copy and paste entire programs into the issue body, hence the issue form asks for a link to a cloneable git repo that includes the complete set of files I need to run it.
>
> If the official example already reproduces the issue, I would prefer to base the investigation on that.

Thank you for the quick reply :) I will try to change my code as you suggested.

tsunhousam91 · Nov 29 '21

> This behaviour does warrant better documentation, but a main method needs to be written in a way that can delay any unwanted initialisation until the app actually enters the foreground state. But the main method does need to be run to launch the process in order to support Android 11's media session resumption feature.

@ryanheise What actually needs to happen here? I think I am missing something, because Android Auto and Android 11 resumption don't work for me if the app hasn't been foregrounded recently. I'm doing quite a bit of setup in the main method (getting device info, authenticating the user, initialising shared preferences and databases). Maybe that's bad practice, but I haven't found a better approach. I can open an issue requesting documentation if this isn't the right place to ask.

keaganhilliard · Nov 30 '21

It may help to first get an example that reproduces the issue.

I would suggest using a widget that wraps your app and handles the lazy initialisation of resources. I believe there are two ways you could detect when to trigger initialisation:

  1. If main is run from the background, the UI will have zero dimensions, which you can discover with a media query. When the dimensions become non-zero, the UI should actually load.
  2. You might alternatively be able to hook into WidgetsBindingObserver (although I haven't tested whether you'll receive an event on initialisation).
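As a rough sketch of option 1 (the names `LazyInitGate` and `initApp` are illustrative, not part of audio_service, and the zero-size behaviour on background launch is an assumption worth verifying on a real device), a wrapper widget could look like this:

```dart
import 'package:flutter/material.dart';

/// Illustrative wrapper that defers expensive setup until the UI actually
/// has non-zero dimensions, i.e. the app is visible rather than resurrected
/// in the background.
class LazyInitGate extends StatefulWidget {
  final Future<void> Function() initApp; // your deferred initialisation
  final Widget child;

  const LazyInitGate({Key? key, required this.initApp, required this.child})
      : super(key: key);

  @override
  State<LazyInitGate> createState() => _LazyInitGateState();
}

class _LazyInitGateState extends State<LazyInitGate> {
  Future<void>? _init;

  @override
  Widget build(BuildContext context) {
    // Option 1 above: a background launch reports zero dimensions, so only
    // start initialisation once the media query reports a real size.
    if (MediaQuery.of(context).size == Size.zero) {
      return const SizedBox.shrink();
    }
    _init ??= widget.initApp(); // run the deferred setup exactly once
    return FutureBuilder<void>(
      future: _init,
      builder: (context, snapshot) =>
          snapshot.connectionState == ConnectionState.done
              ? widget.child
              : const Center(child: CircularProgressIndicator()),
    );
  }
}
```

This would need to sit below a MediaQuery ancestor, e.g. as `home: LazyInitGate(initApp: ..., child: MainScreen())` inside MaterialApp.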

ryanheise · Nov 30 '21

> I'm doing quite a bit of setup in the main method

Incidentally, among all of that, does AudioService.init get called first? If you do other setup before calling it, there is a chance that your audio handler won't be loaded in time to service queries when resurrected from the background. I'm working on code to make that aspect more robust in the fix/early_method_calls branch, so you might be interested in testing that.
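For example (a sketch only, based on the main function above; `initOtherServices` is a hypothetical stand-in for the device info/auth/database setup mentioned earlier), the ordering could be:

```dart
import 'dart:async';

Future<void> main() async {
  WidgetsFlutterBinding.ensureInitialized();
  // Register the audio handler before anything else, so a process
  // resurrected for media resumption can reach the handler immediately.
  _audioHandler = await AudioService.init(
    builder: () => AudioPlayerHandler(),
    config: const AudioServiceConfig(
      androidNotificationChannelId: 'com.ryanheise.myapp.channel.audio',
      androidNotificationChannelName: 'Audio playback',
    ),
  );
  runApp(MyApp());
  // Hypothetical: kick off the remaining setup without blocking the above.
  unawaited(initOtherServices());
}
```

(`unawaited` comes from `dart:async` in Dart 2.15+; on older SDKs the call can simply be left un-awaited.)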

ryanheise · Dec 01 '21