audio_waveforms
Android recorded duration does not match controller duration
I record audio via `RecorderController` and stop automatically after 5 seconds:
```dart
late RecorderController controller;

controller.onCurrentDuration.listen((duration) {
  // Stop automatically once 5 seconds have been recorded.
  if (duration >= const Duration(seconds: 5)) {
    _stopRecording(context);
  }
});
```
Then I check the recorded file's duration via `PlayerController`:
```dart
late PlayerController player;

Future<void> _stopRecording(BuildContext context) async {
  final voicePath = await controller.stop();
  if (voicePath == null) return;
  await player.preparePlayer(
    path: voicePath,
    shouldExtractWaveform: true,
    noOfSamples: 56,
    volume: 1.0,
  );
  // Duration of the recorded file, in milliseconds.
  print("${player.maxDuration}");
}
```
iOS works fine. On Android, however, the controller duration is 5 seconds while `player.maxDuration` prints 4.860 seconds, so the recorded file is shorter than what the controller reports.
I expect the two values to match, or at least to behave the same on both platforms.
- Device: Samsung S21
- OS: One UI Version 5.1
- Android Version: Android 13
@emreerkaslan The stream you are using is not that accurate, as mentioned in the documentation. It starts and stops based on callbacks from the native side, so there can be some variation in milliseconds; even doing this purely in native code, some variation remains. For your use case I would suggest using `Future.delayed` instead of the stream. It may or may not give you closer results.
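A minimal sketch of the `Future.delayed` approach, reusing `controller` and `_stopRecording` from the snippets above (`_startRecording` is a hypothetical wrapper, not part of the package API):

```dart
// Sketch: stop after a fixed wall-clock delay instead of listening
// to onCurrentDuration. Assumes controller and _stopRecording(context)
// are defined as in the snippets above.
Future<void> _startRecording(BuildContext context) async {
  await controller.record();
  // Wait a fixed 5 seconds rather than relying on stream callbacks
  // from the native side.
  await Future.delayed(const Duration(seconds: 5));
  await _stopRecording(context);
}
```

Note the delay covers wall-clock time from when `record()` returns, so any startup latency in the native recorder is still not accounted for.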
Indeed, I did that two weeks ago, though I don't think it is a good workaround. Do you think the two durations can be aligned? I would like to help if it's possible somehow.
I have found native methods for this on both Android and iOS, so this would be a good enhancement for the package.