
Is there a way to play audio from bytes in Flutter Web?

Open · atreeon opened this issue 5 years ago · 16 comments

I have an audio file zipped up on a server. I can download this file and unzip it to bytes in memory (the performance isn't a worry in this use case). Is there a way to play audio from bytes using AssetsAudioPlayer? Or another way to get it to play?
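
For context, the "download and unzip to bytes" part might look roughly like this, using the http and archive packages (untested sketch; the URL and the file-name check are placeholders):

import 'dart:typed_data';

import 'package:archive/archive.dart';
import 'package:http/http.dart' as http;

Future<Uint8List?> downloadAudioBytes(String zipUrl) async {
  // Download the zip and decode it entirely in memory.
  final response = await http.get(Uri.parse(zipUrl));
  final archive = ZipDecoder().decodeBytes(response.bodyBytes);
  // Return the bytes of the first audio file found in the archive.
  for (final file in archive) {
    if (file.isFile && file.name.endsWith('.mp3')) {
      return file.content as Uint8List;
    }
  }
  return null;
}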

atreeon avatar Jun 30 '20 11:06 atreeon

It's not possible, no :/ But if I add this, I have to make it available on Android, iOS & macOS too.

but it's a good feature to add

florent37 avatar Jun 30 '20 13:06 florent37

Thank you @florent37 (apologies, I didn't see your reply!), that would be great (my workaround is having a play button that just downloads the audio!)

atreeon avatar Jul 02 '20 15:07 atreeon

I think it would also be good to be able to get the bytes so you can render visual waveforms like these:

Waves tutorial · Waves demo · GitHub

DomingoMG avatar Jul 21 '20 17:07 DomingoMG

This would be quite useful. I just tried the approach of writing the bytes to a Blob and generating an object URL to play from:

var blob = html.Blob([data], 'audio/mp3'); // blob parts must be passed as a list
var url = html.Url.createObjectUrlFromBlob(blob);
await assetsAudioPlayer.open(Audio.network(url));

Not quite sure if this works at all, though.

beevelop avatar Jul 23 '20 16:07 beevelop

@beevelop Did this work?

S-Man42 avatar Feb 10 '22 07:02 S-Man42

I hope this helps you: https://github.com/ryanheise/just_audio/issues/187

RicDev116 avatar May 26 '22 14:05 RicDev116

I suppose my issue #754 is a duplicate of this one, or at least connected.

As for Android support, this StackOverflow answer (https://stackoverflow.com/questions/4281201/feeding-data-from-memory-to-mediaplayer) says it's possible to play the data as a Base64-encoded string, like this:

String url = "data:audio/amr;base64," + base64EncodedString;
mediaPlayer.setDataSource(url);

For web, a similar approach should be possible if the Blob suggestion above doesn't work: https://stackoverflow.com/questions/17762763/play-wav-sound-file-encoded-in-base64-with-javascript
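
For the web case specifically, a minimal Dart sketch could hand a data: URI straight to the browser's audio element (untested; base64String and the MIME type are placeholders):

import 'dart:html' as html;

void playBase64OnWeb(String base64String, {String mimeType = 'audio/wav'}) {
  // The browser decodes the data: URI itself, so no temporary file is needed.
  final audio = html.AudioElement('data:$mimeType;base64,$base64String');
  audio.play();
}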

When it comes to iOS, based on this post I suppose you should be able to play audio directly from a byte array, though it's written in a Xamarin.iOS context and I'm not really familiar with either Swift or Xamarin.

mt633 avatar Jan 23 '23 10:01 mt633

Just wanted to let you know that I've successfully managed to create a custom source, both using the Base64 approach and as a custom stream (which could be a byte array), for Android and iOS. I haven't attempted this for desktop or web yet, since we don't currently need that functionality (might look at it in the future). If anyone would find this helpful, just let me know and I'll share the code.

mt633 avatar Feb 20 '23 12:02 mt633

@mt633 Would love to see it! :)

S-Man42 avatar Feb 20 '23 15:02 S-Man42

I'll see if I can create a branch with the full code soon, but the basic parts are the following:

Swift

Swift classes for playing bytes or a custom stream
// Based on CachingPlayerItem (https://github.com/neekeetab/CachingPlayerItem)
class BytesSlowMoPlayerItem: SlowMoPlayerItem {
    
    class ResourceLoaderDelegate: NSObject, AVAssetResourceLoaderDelegate, URLSessionDelegate, URLSessionDataDelegate, URLSessionTaskDelegate {
        
        var mimeType: String?
        var session: URLSession?
        var mediaData: Data?
        var response: URLResponse?
        var pendingRequests = Set<AVAssetResourceLoadingRequest>()
        weak var owner: BytesSlowMoPlayerItem?
        
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            
            pendingRequests.insert(loadingRequest)
            processPendingRequests()
            return true
            
        }
        
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader, didCancel loadingRequest: AVAssetResourceLoadingRequest) {
            pendingRequests.remove(loadingRequest)
        }
        
        func processPendingRequests() {
            
            // get all fulfilled requests
            let requestsFulfilled = Set<AVAssetResourceLoadingRequest>(pendingRequests.compactMap {
                self.fillInContentInformationRequest($0.contentInformationRequest)
                if self.haveEnoughDataToFulfillRequest($0.dataRequest!) {
                    $0.finishLoading()
                    return $0
                }
                return nil
            })
        
            // remove fulfilled requests from pending requests
            _ = requestsFulfilled.map { self.pendingRequests.remove($0) }

        }
        
        func fillInContentInformationRequest(_ contentInformationRequest: AVAssetResourceLoadingContentInformationRequest?) {
                contentInformationRequest?.contentType = self.mimeType
                contentInformationRequest?.contentLength = Int64(mediaData!.count)
                contentInformationRequest?.isByteRangeAccessSupported = true
                return
        }
        
        func haveEnoughDataToFulfillRequest(_ dataRequest: AVAssetResourceLoadingDataRequest) -> Bool {
            
            let requestedOffset = Int(dataRequest.requestedOffset)
            let requestedLength = dataRequest.requestedLength
            let currentOffset = Int(dataRequest.currentOffset)
            
            guard let songDataUnwrapped = mediaData,
                songDataUnwrapped.count > currentOffset else {
                // Don't have any data at all for this request.
                return false
            }
            
            let bytesToRespond = min(songDataUnwrapped.count - currentOffset, requestedLength)
            let dataToRespond = songDataUnwrapped.subdata(in: Range(uncheckedBounds: (currentOffset, currentOffset + bytesToRespond)))
            dataRequest.respond(with: dataToRespond)
            
            return songDataUnwrapped.count >= requestedLength + requestedOffset
            
        }
        
        deinit {
            session?.invalidateAndCancel()
        }
        
    }
    
    fileprivate let resourceLoaderDelegate = ResourceLoaderDelegate()
    fileprivate let url: URL
    fileprivate let initialScheme: String?
    fileprivate var customFileExtension: String?
    
    weak var delegate: CachingPlayerItemDelegate?
    
    private let cachingPlayerItemScheme = "bytesPlayerItemScheme"
    
    /// Is used for playing from Data.
    init(data: Data, mimeType: String, fileExtension: String) {
        
        guard let fakeUrl = URL(string: cachingPlayerItemScheme + "://whatever/file.\(fileExtension)") else {
            fatalError("internal inconsistency")
        }
        
        self.url = fakeUrl
        self.initialScheme = nil
        
        resourceLoaderDelegate.mediaData = data
        resourceLoaderDelegate.mimeType = mimeType
        
        let asset = AVURLAsset(url: fakeUrl)
        asset.resourceLoader.setDelegate(resourceLoaderDelegate, queue: DispatchQueue.main)
        super.init(asset: asset, automaticallyLoadedAssetKeys: nil)
        resourceLoaderDelegate.owner = self
        
        addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions.new, context: nil)
        
        NotificationCenter.default.addObserver(self, selector: #selector(playbackStalledHandler), name:NSNotification.Name.AVPlayerItemPlaybackStalled, object: self)
        
    }
    
    override open func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        delegate?.playerItemReadyToPlay?(self)
    }
    
    @objc func playbackStalledHandler() {
        delegate?.playerItemPlaybackStalled?(self)
    }
    
    deinit {
        NotificationCenter.default.removeObserver(self)
        removeObserver(self, forKeyPath: "status")
        resourceLoaderDelegate.session?.invalidateAndCancel()
    }
    
}

// Based on CachingPlayerItem (https://github.com/neekeetab/CachingPlayerItem)
class CustomStreamSlowMoPlayerItem: SlowMoPlayerItem {
            
    class ResourceLoaderDelegate: NSObject, AVAssetResourceLoaderDelegate, URLSessionDelegate, URLSessionDataDelegate, URLSessionTaskDelegate {
        private let channel: FlutterMethodChannel
        var mimeType: String?
        var session: URLSession?
        var mediaData: Data?
        var response: URLResponse?
        let size: Int
        weak var owner: CustomStreamSlowMoPlayerItem?
        
        init(size: Int, channel: FlutterMethodChannel) {
            self.size = size
            self.channel = channel
            
        }
        
        func resourceLoader(_ resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
            
            self.fillInContentInformationRequest(loadingRequest.contentInformationRequest)
            self.getBytes(loadingRequest)
            return true
            
        }
        
        func fillInContentInformationRequest(_ contentInformationRequest: AVAssetResourceLoadingContentInformationRequest?) {
                contentInformationRequest?.contentType = self.mimeType
                contentInformationRequest?.contentLength = Int64(self.size)
                contentInformationRequest?.isByteRangeAccessSupported = true
                return
        }
        
        func getBytes(_ request: AVAssetResourceLoadingRequest) -> Void {
            let dataRequest = request.dataRequest!;
            let requestedLength = dataRequest.requestedLength
            let currentOffset = Int(dataRequest.currentOffset)
            
            channel.invokeMethod("player.getBytes", arguments: ["offset": currentOffset, "length": requestedLength]) {result in
                dataRequest.respond(with: (result as! FlutterStandardTypedData).data)
                request.finishLoading()
            }
        }
        
        deinit {
            session?.invalidateAndCancel()
        }
        
    }

    
    fileprivate let resourceLoaderDelegate: ResourceLoaderDelegate
    fileprivate let url: URL
    fileprivate let initialScheme: String?
    fileprivate var customFileExtension: String?
    
    weak var delegate: CachingPlayerItemDelegate?
    
    private let cachingPlayerItemScheme = "bytesPlayerItemScheme"
    
    /// Is used for playing from a custom stream.
    init(name: String, mimeType: String, fileExtension: String, fileSize: Int, channel:FlutterMethodChannel) {
        
        guard let fakeUrl = URL(string: cachingPlayerItemScheme + "://whatever/file.\(fileExtension)") else {
            fatalError("internal inconsistency")
        }
        
        self.url = fakeUrl
        self.initialScheme = nil
        resourceLoaderDelegate = ResourceLoaderDelegate(size: fileSize, channel: channel)
        
        let asset = AVURLAsset(url: fakeUrl)
        asset.resourceLoader.setDelegate(resourceLoaderDelegate, queue: DispatchQueue.main)
        super.init(asset: asset, automaticallyLoadedAssetKeys: nil)
        resourceLoaderDelegate.owner = self
        
        addObserver(self, forKeyPath: "status", options: NSKeyValueObservingOptions.new, context: nil)
        
        NotificationCenter.default.addObserver(self, selector: #selector(playbackStalledHandler), name:NSNotification.Name.AVPlayerItemPlaybackStalled, object: self)
        
    }
    
    override open func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey : Any]?, context: UnsafeMutableRawPointer?) {
        delegate?.playerItemReadyToPlay?(self)
    }
    
    @objc func playbackStalledHandler() {
        delegate?.playerItemPlaybackStalled?(self)
    }
    
    deinit {
        NotificationCenter.default.removeObserver(self)
        removeObserver(self, forKeyPath: "status")
        resourceLoaderDelegate.session?.invalidateAndCancel()
    }
    
}

@objc protocol CachingPlayerItemDelegate {
    
    /// Is called when the media file is fully downloaded.
    @objc optional func playerItem(_ playerItem: Player.SlowMoPlayerItem, didFinishDownloadingData data: Data)
    
    /// Is called every time a new portion of data is received.
    @objc optional func playerItem(_ playerItem: Player.SlowMoPlayerItem, didDownloadBytesSoFar bytesDownloaded: Int, outOf bytesExpected: Int)
    
    /// Is called after initial prebuffering is finished, meaning
    /// we are ready to play.
    @objc optional func playerItemReadyToPlay(_ playerItem: Player.SlowMoPlayerItem)
    
    /// Is called when the data being downloaded did not arrive in time to
    /// continue playback.
    @objc optional func playerItemPlaybackStalled(_ playerItem: Player.SlowMoPlayerItem)
    
    /// Is called on downloading error.
    @objc optional func playerItem(_ playerItem: Player.SlowMoPlayerItem, downloadingFailedWith error: Error)
    
}

The pure bytes class can then be instantiated with a base64 string like this:

item = BytesSlowMoPlayerItem(data: Data(base64Encoded: base64Data)!, mimeType: mimeTypeString, fileExtension: fileExtensionString)

And the stream like this:

item = CustomStreamSlowMoPlayerItem(name: fileName, mimeType: mimeTypeString, fileExtension: fileExtensionString, fileSize: fileSize!, channel: channel)

Android

The Android class for handling a custom stream
class CustomDataSource(
    private var size: Long,
    private var getBytes: ((offset: Int, length: Int, onDone: (data: ByteArray) -> Unit) -> Unit),
    timeout: Int = 7000
) : BaseDataSource(/* isNetwork = */false) {
    private var uri: Uri? = null
    private var readPosition = 0
    private var bytesRemaining = 0
    private var opened = false
    private val timeout: Int
    private var attempts: Int = 0

    init {
        this.timeout = timeout
    }

    @Throws(IOException::class)
    override fun open(dataSpec: DataSpec): Long {
        uri = dataSpec.uri
        transferInitializing(dataSpec)
        if (dataSpec.position > size) {
            throw DataSourceException(PlaybackException.ERROR_CODE_IO_READ_POSITION_OUT_OF_RANGE)
        }
        readPosition = dataSpec.position.toInt()
        bytesRemaining = size.toInt() - dataSpec.position.toInt()
        if (dataSpec.length != C.LENGTH_UNSET.toLong()) {
            bytesRemaining = min(bytesRemaining.toLong(), dataSpec.length).toInt()
        }
        opened = true
        transferStarted(dataSpec)
        return if (dataSpec.length != C.LENGTH_UNSET.toLong()) dataSpec.length else bytesRemaining.toLong()
    }

    override fun read(buffer: ByteArray, offset: Int, length: Int): Int {
        var dataLength = length
        if (dataLength == 0) {
            return 0
        } else if (bytesRemaining == 0) {
            return C.RESULT_END_OF_INPUT
        }
        try {
            var done = false
            Handler(Looper.getMainLooper()).post {
                dataLength = min(dataLength, bytesRemaining)
                getBytes(readPosition, dataLength, fun(bytes) {
                    System.arraycopy(bytes, 0, buffer, offset, dataLength)
                    bytesTransferred(bytes.size)
                    readPosition += dataLength
                    bytesRemaining -= dataLength
                    done = true
                })
            }
            // Handler needs to be awaited before return
            while (!done) {
                if (attempts > timeout) throw TimeoutException()
                attempts++
                sleep(1)
            }
            attempts = 0

        } catch (e: InterruptedException) {
            //ignore
        }

        return dataLength
    }

    override fun getUri(): Uri? {
        return uri
    }

    override fun close() {
        if (opened) {
            opened = false
            transferEnded()
        }
        uri = null
    }
}

That class can then be used to create a media source like this:

return ProgressiveMediaSource.Factory {
    CustomDataSource(size, onGetBytes)
}.createMediaSource(
    MediaItem.Builder()
        .setUri(Uri.parse("assets_audio_player://${fileName}.${fileExtension}"))
        .setMimeType(mimeTypeString)
        .build()
)

where onGetBytes is a method that calls the channel method to get the bytes from Dart. It could look like this:

onGetBytes = { offset: Int, length: Int, onDone: (data: ByteArray) -> Unit ->
    channel.invokeMethod(METHOD_GET_BYTES, mapOf("offset" to offset, "length" to length), object : MethodChannel.Result {
        override fun success(result: Any?) {
            onDone(result as ByteArray)
        }

        override fun error(
            errorCode: String,
            errorMessage: String?,
            errorDetails: Any?
        ) {
            throw AssetAudioPlayerThrowable.PlayerError(Throwable(errorMessage))
        }

        override fun notImplemented() {
            throw NotImplementedError()
        }
    })
}

Dart

On Android, the base64 data can be played directly using an Audio.file('data:$mimeType;base64,$base64String') object. On iOS, the MIME type, the file extension and the base64 string need to be passed down to Swift and used in the class above. For both Android and iOS, the stream variant must pass the MIME type, the file extension, a name for the stream and the full size of the stream.
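
A minimal usage sketch of that base64 path (fork-only behaviour; base64String and the MIME type here are placeholders):

final assetsAudioPlayer = AssetsAudioPlayer();
// Android accepts the base64 payload wrapped in a data: URI.
await assetsAudioPlayer.open(
  Audio.file('data:audio/wav;base64,$base64String'),
);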

For simplicity, I've just passed the required data in the path string and split it up in the native code, though this should be handled better in the future. Here's the abstract class I use for handling a stream:

abstract class AudioStream extends Audio {
  AudioStream(String name,
      {super.playSpeed,
      super.pitch,
      Map<String, String>? headers,
      Metas? metas,
      super.drmConfiguration,
      required String mimeType,
      required String fileExtension})
      : super._(
            path: '$mimeType;$fileExtension;$name',
            audioType: AudioType.custom,
            cached: false,
            metas: metas ?? Metas());

  /// Return the item size
  Future<int> init();
  Future<List<int>> request([int? offset, int? length]);
}

An example implementation could look like this:

// Needs the fork's assets_audio_player import for AudioStream, plus:
import 'dart:math';
import 'dart:typed_data';

import 'package:flutter/services.dart' show rootBundle;

class MyAudioStream extends AudioStream {
  String filePath;
  MyAudioStream(this.filePath,
      {super.playSpeed,
      super.pitch,
      super.headers,
      super.metas,
      super.drmConfiguration,
      required super.mimeType,
      required super.fileExtension})
      : super(filePath);
  late Uint8List buffer;

  @override
  Future<int> init() async {
    var data = await rootBundle.load(filePath);
    buffer = data.buffer.asUint8List();
    return buffer.length;
  }

  @override
  Future<List<int>> request([int? offset, int? length]) async {
    offset ??= 0;
    length ??= buffer.length;
    var bytesToRespond = min(buffer.length - offset, length);
    var dataToRespond = buffer.sublist(offset, offset + bytesToRespond);
    return dataToRespond;
  }
}
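
Opening such a stream then looks like opening any other Audio source (illustrative; the asset path, MIME type and extension are placeholders):

final player = AssetsAudioPlayer();
await player.open(
  MyAudioStream(
    'assets/audios/sample.mp3',
    mimeType: 'audio/mpeg',
    fileExtension: 'mp3',
  ),
);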

The code does need some cleaning up, but it's working. Hope this helps someone until I get the cleaned-up code uploaded to a branch.

mt633 avatar Feb 21 '23 09:02 mt633

Hi @mt633, I tried to implement MyAudioStream, but it seems like neither init() nor request() is being called. Do I have to call them explicitly?

tokome-id avatar Mar 08 '23 04:03 tokome-id

Yes, you need to connect the method channel request to the class methods in assets_audio_player.dart.

It could look something like this:

case METHOD_GET_BYTES:
    var currentAudio = _playlist?.currentAudio();
    if (currentAudio != null && currentAudio is AudioStream) {
      int? offset = call.arguments['offset'];
      int? length = call.arguments['length'];
      return await currentAudio.request(offset, length);
    }
    break;

Ideally you should hook up init() in a similar way and add a call in the native code to trigger it. For simplicity, I added it to the _open() method when I started testing this out, so that I could pass the size down to the native code directly. However, this freezes the UI while the size is calculated, so it's not the best approach in the long run.
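
For reference, a hypothetical init handler could mirror the METHOD_GET_BYTES case above (METHOD_INIT_BYTES is an invented name here, not an existing channel method):

case METHOD_INIT_BYTES: // hypothetical constant, to be triggered from the native side
    var currentAudio = _playlist?.currentAudio();
    if (currentAudio != null && currentAudio is AudioStream) {
      // init() loads the stream and returns its total size in bytes
      return await currentAudio.init();
    }
    break;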

I'm still sorting out a few things to make the complete code shareable, but hopefully it won't be too far in the future.

mt633 avatar Mar 08 '23 11:03 mt633

Now I've finally published my changes. Both base64 and custom stream for Android/iOS are there. I've also added a few other changes, such as a callback for seeking state, support for custom errors (especially for custom streams) and various bug fixes. Didn't have the time to split them into different branches.

If anyone is interested in trying it out, clone/fork this repository and check out the examples, or add this to your pubspec.yaml:

assets_audio_player:
  git:
    url: https://github.com/mt633/Flutter-AssetsAudioPlayer

Check out the README for example usages.

Let me know if you stumble upon any bugs using it.

mt633 avatar Mar 13 '23 11:03 mt633

@florent37 Please merge the code. @mt633 Please raise a PR if not done already. I actually need to play from bytes in the case of a local download without a file.

Faiyyaz avatar Jul 29 '23 00:07 Faiyyaz

@mt633

Which Audio instance do you use to 'play a base64 audio string'?

final assetsAudioPlayer = AssetsAudioPlayer();

try {
  await assetsAudioPlayer.open(
    Audio.base64('base64String', fileExtension: 'wav', mimeType: 'audio/wav'),
  );
} catch (t) {
  // stream unreachable
}

jhveuzfwe avatar Mar 19 '24 08:03 jhveuzfwe

@jhveuzfwe, I'm not sure I understand your question. Are you using my fork or the default release? This feature is not merged into the main repository, so if you use the version from pub.dev it will not work.

If you use my fork, it should be possible to play a song using the code you posted. If not, please open an issue there instead.

mt633 avatar Mar 19 '24 15:03 mt633