
[iOS][StreamAudioSource] PlayerException ((-11850) Operation Stopped)

Open andrea689 opened this issue 2 years ago • 34 comments

Which API doesn't behave as documented, and how does it misbehave? When I use a byte-array audio source, Android and Web work correctly, but on iOS I get this error: PlayerException ((-11850) Operation Stopped)

P.S. I don't know why Android needs android:usesCleartextTraffic="true" to work

Minimal reproduction project

https://github.com/andrea689/just_audio_ios_error

main.dart
import 'dart:convert';
import 'dart:typed_data';

import 'package:collection/collection.dart';
import 'package:flutter/material.dart';
import 'package:http/http.dart' as http;
import 'package:just_audio/just_audio.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  // This widget is the root of your application.
  @override
  Widget build(BuildContext context) {
    return const MaterialApp(
      home: MyHomePage(),
    );
  }
}

class MyHomePage extends StatelessWidget {
  const MyHomePage({Key? key}) : super(key: key);

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: const Text('Sound Test'),
      ),
      body: Center(
        child: FutureBuilder<http.Response>(
          future: http.get(
              Uri.parse('https://filebin.net/4i2f18nheahilka7/audio.json')),
          builder: (context, snapshot) {
            if (snapshot.hasData) {
              final dataBuffer = Uint8List.fromList(
                  List<int>.from(jsonDecode(snapshot.data!.body)['bytes']));
              return SoundPlayerUI(dataBuffer: dataBuffer);
            }
            return const CircularProgressIndicator();
          },
        ),
      ),
    );
  }
}

class SoundPlayerUI extends StatefulWidget {
  final Uint8List dataBuffer;
  const SoundPlayerUI({
    Key? key,
    required this.dataBuffer,
  }) : super(key: key);

  @override
  State<SoundPlayerUI> createState() => _SoundPlayerUIState();
}

class _SoundPlayerUIState extends State<SoundPlayerUI> {
  late AudioPlayer _audioPlayer;
  Duration duration = const Duration();

  @override
  void initState() {
    super.initState();
    _audioPlayer = AudioPlayer();

    _audioPlayer
        .setAudioSource(MyAudioSource(widget.dataBuffer))
        .then((value) => setState(() => duration = value ?? const Duration()))
        .catchError((error) {
      // catch load errors: 404, invalid url ...
      print("An error occured $error");
    });
  }

  @override
  void dispose() {
    _audioPlayer.dispose();
    super.dispose();
  }

  String _printDuration(Duration duration) {
    String twoDigits(int n) => n.toString().padLeft(2, "0");
    String twoDigitMinutes = twoDigits(duration.inMinutes.remainder(60));
    String twoDigitSeconds = twoDigits(duration.inSeconds.remainder(60));
    return "$twoDigitMinutes:$twoDigitSeconds";
  }

  @override
  Widget build(BuildContext context) {
    return Card(
      child: Row(
        children: [
          StreamBuilder<PlayerState>(
            stream: _audioPlayer.playerStateStream,
            builder: (_, snapshot) {
              final processingState = snapshot.data?.processingState;

              if (processingState == ProcessingState.loading ||
                  processingState == ProcessingState.buffering) {
                return Center(
                  child: Container(
                    margin: const EdgeInsets.all(12),
                    width: 24,
                    height: 24,
                    child: const CircularProgressIndicator(),
                  ),
                );
              }

              if (_audioPlayer.playing == false) {
                return IconButton(
                  icon: const Icon(Icons.play_arrow),
                  color: Theme.of(context).colorScheme.primary,
                  onPressed: () {
                    _audioPlayer.play();
                  },
                );
              }

              if (processingState != ProcessingState.completed) {
                return IconButton(
                  icon: const Icon(Icons.pause),
                  color: Theme.of(context).colorScheme.primary,
                  onPressed: () {
                    _audioPlayer.pause();
                  },
                );
              }

              return IconButton(
                icon: const Icon(Icons.replay),
                color: Theme.of(context).colorScheme.primary,
                onPressed: () {
                  _audioPlayer.stop();
                  _audioPlayer.seek(
                    Duration.zero,
                    index: _audioPlayer.effectiveIndices?.firstOrNull,
                  );
                  _audioPlayer.play();
                },
              );
            },
          ),
          Expanded(
            child: StreamBuilder<Duration>(
              stream: _audioPlayer.positionStream,
              builder: (context, snapshot) {
                final currentDuration = snapshot.data ?? const Duration();
                final totalDuration =
                    duration.inMilliseconds == 0 ? 1 : duration.inMilliseconds;
                final position = currentDuration.inMilliseconds / totalDuration;
                return Row(
                  children: [
                    Text(
                      '${_printDuration(currentDuration)} / ${_printDuration(duration)}',
                    ),
                    const SizedBox(width: 16),
                    Expanded(
                      child: ClipRRect(
                        borderRadius:
                            const BorderRadius.all(Radius.circular(10)),
                        child: LinearProgressIndicator(
                          value: position,
                          minHeight: 6,
                        ),
                      ),
                    ),
                    const SizedBox(width: 16),
                  ],
                );
              },
            ),
          ),
        ],
      ),
    );
  }
}

class MyAudioSource extends StreamAudioSource {
  final Uint8List _buffer;

  MyAudioSource(this._buffer) : super(tag: 'MyAudioSource');

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    // Returning the stream audio response with the parameters
    return StreamAudioResponse(
      sourceLength: _buffer.length,
      // Content-Length is the size of the returned chunk, not a negative offset.
      contentLength: (end ?? _buffer.length) - (start ?? 0),
      offset: start ?? 0,
      stream: Stream.fromIterable([_buffer.sublist(start ?? 0, end)]),
      contentType: 'audio/wav',
    );
  }
}

To Reproduce (i.e. user steps, not code)
Steps to reproduce the behavior:

  1. Open app

Error messages

PlayerException ((-11850) Operation Stopped)

Expected behavior
Correct audio playback

Smartphone (please complete the following information):

  • Device: real iPhone6 (iOS 14.7.1) - simulator iPhone 11 Pro Max (iOS 14.5)

Flutter SDK version

Doctor summary (to see all details, run flutter doctor -v):
[✓] Flutter (Channel stable, 2.10.1, on macOS 11.6 20G165 darwin-x64, locale en-GB)
[✓] Android toolchain - develop for Android devices (Android SDK version 30.0.2)
[!] Xcode - develop for iOS and macOS (Xcode 12.5.1)
    ! Flutter recommends a minimum Xcode version of 13.
      Download the latest version or update via the Mac App Store.
[✓] Chrome - develop for the web
[✓] Android Studio (version 2021.1)
[✓] VS Code (version 1.65.2)
[✓] Connected device (5 available)
[✓] HTTP Host Availability

! Doctor found issues in 1 category.

andrea689 avatar Mar 15 '22 17:03 andrea689

You didn't follow the instructions for submitting a minimal reproduction project. I will need the link.

ryanheise avatar Mar 16 '22 01:03 ryanheise

@ryanheise sorry, this is the link: https://github.com/andrea689/just_audio_ios_error

andrea689 avatar Mar 16 '22 08:03 andrea689

For sanity, can you try rewriting the same example but hosting the remote file in WAV format rather than JSON? That will make it easier to confirm whether you have valid or invalid audio data.

P.S. I don't know why Android needs android:usesCleartextTraffic="true" to work

Because just_audio creates a proxy on http://localhost:.... to serve stream audio sources, and since that proxy URL is 'http' rather than 'https', Android requires the android:usesCleartextTraffic option.
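
For reference, that option is the usesCleartextTraffic attribute on the <application> element in AndroidManifest.xml, roughly:

<application
    android:usesCleartextTraffic="true"
    ... >
</application>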

ryanheise avatar Mar 16 '22 10:03 ryanheise

@ryanheise I updated the repo

andrea689 avatar Mar 16 '22 11:03 andrea689

I haven't figured out why it doesn't work yet; however, I have discovered that your code will work if you use mp3 instead of wav, so there might be a workaround you could use in the meantime.

ryanheise avatar Mar 16 '22 13:03 ryanheise

@ryanheise unfortunately I only have wav samples.. thanks anyway!

andrea689 avatar Mar 16 '22 13:03 andrea689

You can't convert those wav files to MP3 using ffmpeg or similar?

ryanheise avatar Mar 16 '22 13:03 ryanheise

I would have to change the endpoint that generates the wav, and currently I can't.

Do you think this is a problem that you will be able to solve?

Otherwise I would have to use flutter_sound for iOS and just_audio for Android, but I would like to use only one library.

Until now I have been using flutter_sound, which is no longer maintained, and I was migrating to just_audio because of a crash on some Android devices (https://github.com/Canardoux/flutter_sound/issues/780).

andrea689 avatar Mar 16 '22 14:03 andrea689

Another workaround that should work is to download the JSON, reconstruct the raw byte data, write it to a file with a .wav filename extension, and then use AudioSource.uri with Uri.file(filePath).
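
A rough sketch of that workaround (assuming the path_provider package for a temporary directory; the helper name is mine):

import 'dart:io';
import 'dart:typed_data';

import 'package:just_audio/just_audio.dart';
import 'package:path_provider/path_provider.dart';

Future<Duration?> playBytesAsWavFile(AudioPlayer player, Uint8List bytes) async {
  // Write the raw bytes to a temporary .wav file so that iOS can infer the
  // format from the filename extension instead of relying on proxy headers.
  final dir = await getTemporaryDirectory();
  final file = File('${dir.path}/audio.wav');
  await file.writeAsBytes(bytes, flush: true);

  // Point the player at the file URI instead of a StreamAudioSource.
  final duration = await player.setAudioSource(
    AudioSource.uri(Uri.file(file.path)),
  );
  player.play(); // fire and forget; the returned future completes when playback finishes
  return duration;
}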

ryanheise avatar Mar 16 '22 14:03 ryanheise

ok, now I try it, thanks

andrea689 avatar Mar 16 '22 14:03 andrea689

It works!

I decided to write to a file on Android and iOS, and keep the byte array approach on the web. This way, no HTTP proxy is needed on Android.

Many thanks!

andrea689 avatar Mar 16 '22 15:03 andrea689

Glad to hear.

Let's still keep this issue open, though, since I will eventually want to look into why StreamAudioSource isn't working with wav content.

ryanheise avatar Mar 16 '22 15:03 ryanheise

@andrea689 I ran into the same problem, but I am using the setUrl method. I fixed it by adding byte-range support on the backend (the backend needs to handle range requests).

This is the part of the package documentation that I am referring to:

The iOS player relies on server headers (e.g. Content-Type, Content-Length and byte range requests) to know how to decode the file and where applicable to report its duration. In the case of files, iOS relies on the file extension.
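
For custom StreamAudioSource implementations like the one in this issue, those headers are derived from the fields of StreamAudioResponse, so the range arithmetic matters. A minimal range-aware sketch (the field-to-header mapping is my reading of the proxy, not taken from the package docs):

import 'dart:typed_data';

import 'package:just_audio/just_audio.dart';

class BytesAudioSource extends StreamAudioSource {
  final Uint8List _bytes;

  BytesAudioSource(this._bytes) : super(tag: 'BytesAudioSource');

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    final from = start ?? 0;
    final to = end ?? _bytes.length;
    return StreamAudioResponse(
      sourceLength: _bytes.length, // total size, reported via Content-Range
      contentLength: to - from,    // size of this chunk (Content-Length)
      offset: from,                // where the chunk starts within the source
      stream: Stream.value(_bytes.sublist(from, to)),
      contentType: 'audio/wav',    // must match the actual encoding of the bytes
    );
  }
}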

MyisCARRY avatar Mar 30 '22 12:03 MyisCARRY

I just ran into this issue as well with AAC files converted with FFmpeg. As you said, it works with MP3, but for me it even works with the original WAV file. Here are some sample files you could use to recreate the issue: example-files.zip

As previously mentioned, it works on Android but not on iOS.

I'm trying to protect the file by storing it in a password-protected ZIP file and then reading the stream from the archive, so I would prefer not to unpack the archive and store a temporary file somewhere, even if that would be a functional workaround.

If AAC could work, I'd prefer that over using MP3.

mt633 avatar Apr 13 '22 10:04 mt633

Thanks for providing the test files. I don't have any answers yet as to why this is happening, because the proxy headers, including the content type, all looked right to me last time I investigated. Have you tested whether your files work fine when pulled directly from some server URL? If that works, it's a matter of comparing that server's HTTP headers with the headers the proxy generates to see where it's going wrong.

ryanheise avatar Apr 13 '22 11:04 ryanheise

You mean just something like this?

audioPlayer.setUrl('http://localhost:8000/boxaac.m4a');
audioPlayer.play();

If so, then yes, it works.

mt633 avatar Apr 13 '22 11:04 mt633

If it helps, this seems to be the line of code where the library runs into the error: https://github.com/ryanheise/just_audio/blob/29f201dff0a24e62acf07277f3226a504bb9e9d3/just_audio/lib/just_audio.dart#L784

mt633 avatar Apr 13 '22 12:04 mt633

You mean just something like this?

audioPlayer.setUrl('http://localhost:8000/boxaac.m4a');
audioPlayer.play();

If so, then yes, it works.

Wait, what server is that? If that's the proxy itself, then that's certainly not what I meant because in that case there would be no expected difference in headers. Although if it is the proxy you are testing, it is surprising to hear that it works with setUrl.

ryanheise avatar Apr 13 '22 13:04 ryanheise

No, it's just a locally hosted web server I used to try streaming the file with setUrl. I found that easier than publishing the file online.

mt633 avatar Apr 13 '22 13:04 mt633

In that case, I still can't connect to it and check the headers myself. Can you?

ryanheise avatar Apr 13 '22 13:04 ryanheise

I'll see if I can find the headers you're looking for; meanwhile you might want to test, e.g., this URL I found when searching GitHub for .m4a. It behaves the same way for me: I can get that URL to play directly in just_audio using setUrl, but if I download it and use a custom StreamAudioSource to play it, it won't work.

mt633 avatar Apr 13 '22 13:04 mt633

I'm not entirely sure which headers you want, but if you point me to the place in the code where you want the variables checked, I could do that.

Another interesting finding is that if I instead change contentType in StreamAudioResponse to contentType: 'audio/wav', the m4a file plays as it should. Setting it to 'audio/aac' or any other format throws the same error as before.

mt633 avatar Apr 14 '22 07:04 mt633

In the code, you can print out the proxy's headers in _proxyHandlerForSource. Then we want to compare those headers with the headers of another web server that works. If it's a public web server, I would generally use curl to see what headers come back in the response.

It is interesting that putting in the wrong content type would cause it to work.

ryanheise avatar Apr 15 '22 02:04 ryanheise

I'm not sure I fully understand what you are after, but I set a breakpoint here: https://github.com/ryanheise/just_audio/blob/29f201dff0a24e62acf07277f3226a504bb9e9d3/just_audio/lib/just_audio.dart#L3020

That gave the following output from the headers variable in the request response. The two continued to look the same on the second and third breaks; then the AAC version failed, whereas the WAV version ran a fourth time and then started playing.

With content type aac

_HttpHeaders (content-type: audio/aac
set-cookie: DARTSESSID=5d318dc2d814d2798f736cacf7f3226e; Path=/; HttpOnly
accept-ranges: bytes
content-length: 2
content-range: bytes 0-1/7347742
)

With content type wav

_HttpHeaders (content-type: audio/wav
set-cookie: DARTSESSID=bcccc9a2cce788738dd66e890a22f4a7; Path=/; HttpOnly
accept-ranges: bytes
content-length: 2
content-range: bytes 0-1/7347742
)

Web server headers

Server: Apache/2.4.51 (Unix) OpenSSL/1.1.1k PHP/8.0.12 mod_perl/2.0.11 Perl/v5.32.1
Last-Modified: Tue, 19 Apr 2022 07:17:36 GMT
ETag: "701e1e-5dcfcabd0a400"
Accept-Ranges: bytes
Content-Length: 7347742
Keep-Alive: timeout=5, max=100
Connection: Keep-Alive

When I had done this, I checked the GitHub URL I posted earlier which returned the following:

Connection: keep-alive
Content-Length: 65407
Cache-Control: max-age=300
content-disposition: attachment; filename=sounds/Beta_m4a/samples/BassWumm_A.m4a
Content-Security-Policy: default-src 'none'; style-src 'unsafe-inline'; sandbox
Content-Type: audio/mp4
ETag: W/"90bee47ac11adb72b15cc1d8018a51c21380a04185237567fb4d8bd6e44e9ca2"
Strict-Transport-Security: max-age=31536000
X-Content-Type-Options: nosniff
X-Frame-Options: deny
X-XSS-Protection: 1; mode=block
X-GitHub-Request-Id: 7498:0E31:B7FC1:EFCE8:625E7AA4
Accept-Ranges: bytes
Date: Tue, 19 Apr 2022 09:02:28 GMT
Via: 1.1 varnish
X-Served-By: cache-bma1633-BMA
X-Cache: MISS
X-Cache-Hits: 0
X-Timer: S1650358948.462872,VS0,VE401
Vary: Authorization,Accept-Encoding,Origin
Access-Control-Allow-Origin: *
X-Fastly-Request-ID: 077f796d939110411cb917232a21e4798809d130
Expires: Tue, 19 Apr 2022 09:07:28 GMT
Source-Age: 0

That made me realize that the correct MIME type for m4a files is audio/mp4, and using that works for me with just_audio. audio/aac is apparently only for raw ADTS streams.
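
In a custom StreamAudioSource serving m4a data, that boils down to something like this (a sketch of the request() override, assuming a Uint8List field named _bytes):

@override
Future<StreamAudioResponse> request([int? start, int? end]) async {
  final from = start ?? 0;
  final to = end ?? _bytes.length;
  return StreamAudioResponse(
    sourceLength: _bytes.length,
    contentLength: to - from,
    offset: from,
    stream: Stream.value(_bytes.sublist(from, to)),
    contentType: 'audio/mp4', // MP4 container type; 'audio/aac' is only for raw ADTS
  );
}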

In other words, this is no longer an issue for me in my current setup, so I'll leave further investigation to you.

mt633 avatar Apr 19 '22 09:04 mt633

Which API doesn't behave as documented, and how does it misbehave? When I use a byte-array audio source, Android and Web work correctly, but on iOS I get this error: PlayerException ((-11850) Operation Stopped)

P.S. I don't know why Android needs android:usesCleartextTraffic="true" to work

FYI, I have just updated the iOS setup documentation in the README with the correct documentation for the iOS equivalent of usesCleartextTraffic. I think this section was originally correct, but last year I added another option for iOS 10+ which turned off the other option; you will actually get the correct behaviour on all versions if you use the older iOS 9 option. Details are in the README and the official example's Info.plist.
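
The iOS 9-era option referred to is, if I recall correctly, NSAllowsArbitraryLoads under NSAppTransportSecurity; roughly, in Info.plist (check the README and the example's Info.plist for the authoritative version):

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>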

ryanheise avatar Jun 28 '22 12:06 ryanheise

What the hell. I am streaming AAC in an mp4 container (audio/mp4). I spent almost 4 nights trying to figure out why the player was not working on iOS. After setting the MIME type to audio/mp3 (it's still not an mp3), it suddenly works (almost) perfectly???

cameralis avatar Aug 26 '22 02:08 cameralis

@55nknown are you using a feature that enables the proxy, such as HTTP headers or LockCachingAudioSource or StreamAudioSource?

ryanheise avatar Aug 26 '22 02:08 ryanheise

I am using StreamAudioSource

cameralis avatar Aug 26 '22 11:08 cameralis

I'm running into the same issue: an .m4a plays fine before running it through ffmpeg and fails after.

In case another example is at all helpful, here's the command (for debugging purposes I've trimmed it down to just decode and re-encode): -i "var/mobile/.../recording_2022_10_02_24527.m4a" var/mobile/.../recording_2022_10_02_24527_denoised.m4a

Here are the files: testing clips.zip

Let me know if there is anything else I can do to help! In the meantime I'll use a streaming audio source instead of setFile and manually specify the content type as others have done above.

caseycrogers avatar Oct 02 '22 15:10 caseycrogers

@caseycrogers if you're using setFile, then you have a different issue because this issue is about a problem that occurs when using StreamAudioSource. When using setFile, you are depending on iOS's method of using the file extension to determine the file type. just_audio doesn't have a say in what iOS does there, so you would need to read the iOS documentation to see what filename extensions it recognises for what types.

ryanheise avatar Oct 02 '22 15:10 ryanheise