How to pipe ffmpeg output to the video_player Flutter widget (iOS)
There are several Flutter video players around, but none of them supports the input formats and codecs of this powerful ffmpeg-based project, for example reading RTSPS or RTMPS inputs. The FFmpegKitConfig.registerNewFFmpegPipe() approach would cover almost everything, but unfortunately it seems to work only on Android. I tried with the native and VLC players on iOS but had no luck (unless I missed something).
Describe the solution you'd like
Support for FFmpegKitConfig.registerNewFFmpegPipe() on iOS, or an ffplay interface, would be really amazing.
Platform: iOS
Additional context
See the old issue: https://github.com/tanersener/flutter-ffmpeg/issues/92
Implementing ffplay is one of the items in our long-term roadmap. See the Feature Roadmap.
What exactly do you want us to support regarding the FFmpegKitConfig.registerNewFFmpegPipe() method? It is implemented on iOS and it was working well the last time I checked.
Hi @tanersener thanks for your feedback.
My requirement is low-latency encrypted streams, so I'm testing the RTMPS and RTSPS protocols with the FLV format and the H264 codec. Without this requirement I could use the HLS or MP4 format, where FFmpegKitConfig.registerNewFFmpegPipe() probably works, but then I could do it even without ffmpeg, by using a video player directly.
So doing something like:
```dart
FFmpegKitConfig.registerNewFFmpegPipe().then((pipePath) {
  print("=> pipePath: " + pipePath.toString());
  FFmpegKit.executeAsync(
    '-i rtsps://myserver:1937/test -c:v copy -f flv -an -y $pipePath',
    (Session session) async {},
    (Log log) {
      // print(log.getMessage());
    },
    (Statistics statistics) {
      print(statistics.getBitrate());
    },
  );
  setState(() {
    pipeStream1 = pipePath!;
  });
});
```
and then VlcPlayerController.file(pipePath), which works well for Android but not for iOS. Were your tests based on MP4 or HLS? I wonder if my problem is that something is missing from VLCPlayer for iOS, or if there is something we can do in this project to support it.
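For reference, the player side is just a sketch like this (assuming the flutter_vlc_player package; `buildPipeController` is a hypothetical helper name, and `pipePath` is the value returned by registerNewFFmpegPipe() above):

```dart
// Sketch only: build a VLC controller that reads from the pipe path.
// Assumes the flutter_vlc_player package is in pubspec.yaml;
// buildPipeController is a hypothetical helper, not from the plugin API.
import 'dart:io';

import 'package:flutter_vlc_player/flutter_vlc_player.dart';

VlcPlayerController buildPipeController(String pipePath) {
  return VlcPlayerController.file(
    File(pipePath),
    autoInitialize: true,
    autoPlay: true,
    hwAcc: HwAcc.auto,
  );
}
```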
I tested pipes with streaming data a long time ago. I don't remember the platform, the codec I used, or the outcome of that test. These days I mostly test pipes with .mp4 files, and everything works fine.
In ffmpeg-kit, we are creating pipes using the same commands on both platforms. I don't think there is an obvious mistake there.
I'm not sure what exactly is failing on iOS for you, but the player is there. It should be logging something about your file; that can explain where the problem is.
Hey @itforgeuk
What is the error / issue you are experiencing? Is the video on VLC not starting to play?
I've been using FFmpeg-Kit lately with Flutter, and it works very nicely on iOS. In my case, I've been streaming a local raw H264 byte stream into the input pipe and writing to an output pipe. To reduce latency, I've been using HLS as the output format. I initialize VLC before executing ffmpeg-kit, and then trigger "play" as soon as the session's getTime() is greater than 1.
Example:
```dart
FFmpegKitConfig.registerNewFFmpegPipe().then((inputPipe) async {
  FFmpegKitConfig.registerNewFFmpegPipe().then((outputPipe) async {
    // Not sure why, but I had to close the output pipe here,
    // otherwise the FFmpeg conversion won't start.
    FFmpegKitConfig.closeFFmpegPipe(outputPipe);

    // Initializing the VLC video player.
    setState(() {
      _vlcController ??= VlcPlayerController.file(
        File(outputPipe),
        autoInitialize: true,
        autoPlay: false,
        hwAcc: HwAcc.auto,
      );
    });

    await FFmpegKit.executeAsync(
      // Using HLS for minimum latency.
      '-y -avioflags direct -max_delay 0 -flags2 showall -f h264 -i $inputPipe -fflags nobuffer+discardcorrupt+noparse+nofillin+ignidx+flush_packets+fastseek -avioflags direct -max_delay 0 -flags low_delay -f hls -hls_time 0 -hls_allow_cache 0 $outputPipe',
      // MP4 works too, but it's not the best format for streaming,
      // as it causes additional latency. Example with MP4:
      // '-y -avioflags direct -max_delay 0 -flags2 showall -f h264 -i $inputPipe -fflags nobuffer+discardcorrupt+noparse+nofillin+ignidx+flush_packets+fastseek -avioflags direct -max_delay 0 -f mp4 -movflags frag_keyframe+empty_moov $outputPipe',
      (session) async {
        _ffmpegKitSessionId = session.getSessionId();
      },
      (log) {
        // Note: outputting logs seems to cause additional latency for some reason.
      },
      (statistics) async {
        if (statistics.getTime() >= 1 &&
            await _vlcController?.isPlaying() == false) {
          setState(() {
            _vlcController?.play();
          });
        }
      },
    );
  });
});
```
Hope this helps...
FYI @tanersener
Hi @orenagiv ,
Thanks for the information. Your snippet above doesn't make it very clear how you're feeding your inputPipe, but I guess you write data into it from somewhere else. I tested a few more things but I'm still not getting the expected results.
I can confirm that without the
FFmpegKitConfig.closeFFmpegPipe(outputPipe);
call, ffmpeg stays blocked forever, no matter when the player is initiated or when it tries to start reading. I wonder if this by itself indicates that iOS (15.5) no longer behaves the way it did when this was originally tested, or if it simply means that once you close it before writing, it is not a pipe anymore but just a regular file.
So, closing the pipe right after creating it:
```dart
FFmpegKitConfig.registerNewFFmpegPipe().then((outputPipe) async {
  print("=> outputPipe: " + outputPipe.toString());
  // Close it
  FFmpegKitConfig.closeFFmpegPipe(outputPipe!);
  ...
});
```
makes my FFmpegKit.executeAsync() call work, and I can see logs without errors. I tried to initiate VLC when the session's getTime() > 1 and managed to play just a few frames, but then playback stopped while ffmpeg kept streaming and saving the output in FLV format without errors.
What happens here is that the outputPipe behaves like a regular FLV file. It is very easy to confirm: the generated "outputPipe" grows while downloading data from an rtmp/rtsp stream, so if you initiate the player when getTime() > 1, it will play the downloaded chunk until the end of the file without waiting for new data/frames.
If I copy this FLV file from the iOS simulator to my desktop, I can play it with VLC.
Changing the approach and re-encoding the input to valid HLS could be a solution, but in that case FFmpegKitConfig.registerNewFFmpegPipe() shouldn't necessarily be involved, as you can get exactly the same results if you store your HLS output directly to a file/path instead of a pipe.
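To illustrate that file-based alternative, here is a minimal sketch (the output path, the HLS flags, and the `startHlsToDirectory` helper are my own assumptions, not a tuned command):

```dart
// Sketch: write low-latency HLS into a plain directory instead of a pipe,
// then hand the playlist path to the player. Paths and flags are illustrative.
import 'dart:io';

import 'package:ffmpeg_kit_flutter/ffmpeg_kit.dart';

Future<String> startHlsToDirectory(String inputUrl, Directory outDir) async {
  final playlist = '${outDir.path}/stream.m3u8';
  await FFmpegKit.executeAsync(
    '-i $inputUrl -c:v copy -an -f hls -hls_time 1 -hls_list_size 3 '
    '-hls_flags delete_segments -y $playlist',
  );
  return playlist; // point the player at this path instead of a pipe
}
```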
So, are pipes broken on iOS? That is my understanding; please correct me if I'm wrong.
Hey @itforgeuk
I'm not sure about what I'm about to say, but as far as I know, FLV is not built for streaming, and it didn't work for me either. HLS, however, was designed for that purpose, and works for me on iOS (on Android I'm struggling with another issue, so I couldn't verify yet).
I used the output pipe for convenience, but yes, I could have simply used an output file (as long as it's HLS too) and then it works just fine.
You can see how I feed the input-pipe in the full example code here:
https://github.com/DragonX-cloud/dji_flutter_plugin/blob/main/example/lib/example.dart
(search for `_videoFeedSink`)
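In short, the idea (paraphrased from the example, with hypothetical function names) is to open an IOSink on the input pipe and append each incoming H.264 chunk to it:

```dart
// Sketch: feed raw H.264 bytes into the ffmpeg input pipe.
// openVideoFeed / onVideoFrame are hypothetical names, not from the plugin.
import 'dart:io';

IOSink? _videoFeedSink;

// Open the sink once, on the path returned by registerNewFFmpegPipe().
void openVideoFeed(String inputPipe) {
  _videoFeedSink = File(inputPipe).openWrite();
}

// Append every incoming chunk (e.g. from a platform channel) to the pipe.
void onVideoFrame(List<int> h264Bytes) {
  _videoFeedSink?.add(h264Bytes);
}
```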
Hope this helps.
I did a few more tests and comparisons with ffmpeg_kit and VLC player.
I tested a round trip like this:
- Push a local rtsp input to a remote rtsp server
- Pull the input from the remote rtsp server
- Convert the stream to low-latency HLS and play it with the VLC player on iOS.
I'm not transcoding; I'm using -c:v copy everywhere. The latency is disappointing with this approach: 6 to 7 seconds, so I can't use it for my project. Doing the same outside of Flutter with ffmpeg and ffplay, the latency is about 1 second.
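For comparison, the desktop ffplay side of that test looked roughly like this (a sketch; the server address, port, and stream name are placeholders):

```sh
# Pull from the remote RTSP server with minimal buffering and play locally.
# myserver, the port, and the stream name are placeholders.
ffplay -fflags nobuffer -flags low_delay -rtsp_transport tcp \
  rtsp://myserver:8554/test
```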
If you have any tips, I can give it one more try.
Hey @itforgeuk,
I'm experiencing the same thing, and from what I've investigated so far, my conclusion is that it's the Flutter VLC Player plugin which causes the delay.
I'm trying different approaches / players to display the ffmpeg output. I'll let you know if I find any solution.
Hey @itforgeuk, I've just had a breakthrough :) I'm still in the middle of it, so I don't have final conclusions yet, but I've managed to stream the video via the "local network" using the package https://pub.dev/packages/local_assets_server (it allows serving both the assets folder and any other directory, using the rootDir property).
And I'm using the "enhanced" video player called "Better Player" to play the video from the "network" data-source: https://pub.dev/packages/better_player
I'm working on a branch of mine called feature/native_video_view_or_better_player - You can see the example source-code I'm experimenting with here (work-in-progress):
https://github.com/DragonX-cloud/dji_flutter_plugin/blob/feature/native_video_view_or_better_player/example/lib/example.dart
Once I have a final conclusion on whether this approach works and produces "0" latency, I'll post all the details here.
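For the curious, the shape of that experiment is roughly this (a sketch; I'm paraphrasing the two packages' APIs, so the parameter names, the `serveAndPlay` helper, and the playlist filename are assumptions):

```dart
// Sketch: serve an HLS output directory over a local HTTP server and
// play it with Better Player. APIs paraphrased; treat as an untested outline.
import 'dart:io';

import 'package:better_player/better_player.dart';
import 'package:local_assets_server/local_assets_server.dart';

Future<BetterPlayerController> serveAndPlay(Directory hlsDir) async {
  // Serve the directory that ffmpeg-kit writes the HLS segments into.
  final server = LocalAssetsServer(
    address: InternetAddress.loopbackIPv4,
    rootDir: hlsDir,
  );
  final address = await server.serve();

  // Point Better Player at the served playlist (filename is an assumption).
  final dataSource = BetterPlayerDataSource(
    BetterPlayerDataSourceType.network,
    'http://${address.address}:${server.boundPort}/stream.m3u8',
  );
  return BetterPlayerController(
    const BetterPlayerConfiguration(autoPlay: true),
    betterPlayerDataSource: dataSource,
  );
}
```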
This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 7 days.
This issue was closed because it has been stalled for 7 days with no activity.