audio_waveforms
Recorded audio file from Android can't be played on iOS. Android to Android, iOS to iOS, and iOS to Android are fine.
Hey @jackwill99, the link you provided requires access; can you please make it open? Also, what configuration did you set for recording the audio file? With the default settings I'm able to record on Android and play it on iOS.
I have the same problem. Any updates?
I'm having the same problem. Is there any solution?
I solved it temporarily by using an ffmpeg package in the backend (Laravel). That can solve your problem, but I'd like to get this feature from Flutter. Thanks, guys.
@jackwill99 May I know how to use the ffmpeg command?
@jae-om Here, audio means your current (input) audio file and modify_audio means the converted audio produced by ffmpeg. In this case, we used an audio bitrate of 192 kbps.

ffmpeg -i audio -c:a aac -b:a 192k modify_audio
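For reference, the same conversion can be run from Dart instead of a Laravel backend. This is only a minimal sketch, assuming an ffmpeg binary is installed on the machine executing it; the convertToAac name and the file paths are placeholders, not something from this thread.

import 'dart:io';

/// Re-encodes [inputPath] to AAC at 192 kbps by shelling out to a locally
/// installed ffmpeg binary, writing the result to [outputPath].
Future<void> convertToAac(String inputPath, String outputPath) async {
  final result = await Process.run('ffmpeg', [
    '-i', inputPath, // input audio (e.g. the file recorded on Android)
    '-c:a', 'aac', // encode the audio stream with the AAC codec
    '-b:a', '192k', // target audio bitrate of 192 kbps
    outputPath, // converted output file
  ]);
  if (result.exitCode != 0) {
    throw Exception('ffmpeg conversion failed: ${result.stderr}');
  }
}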
And @ujas-m-simformsolutions, I have granted the required access to my file, and my config is here:
void _initialiseControllers() {
  recorderController = RecorderController()
    ..androidEncoder = AndroidEncoder.aac
    ..androidOutputFormat = AndroidOutputFormat.aac_adts
    ..iosEncoder = IosEncoder.kAudioFormatMPEG4AAC
    ..sampleRate = Platform.isAndroid ? 16000 : 44100
    ..bitRate = Platform.isAndroid ? 64000 : 48000;
}
This code is final; I tested by changing androidOutputFormat, sampleRate, and bitRate, but the result is the same.
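For reference, this is roughly how the recording is started and stopped with that controller. Treat it as a minimal sketch: the record()/stop() calls, the named path parameter, and the .m4a file name are assumptions about typical usage of the package, not details confirmed in this thread.

import 'dart:io';
import 'package:audio_waveforms/audio_waveforms.dart';

// Assumes recorderController was initialised as in _initialiseControllers().
Future<void> recordForFiveSeconds(RecorderController recorderController) async {
  // Record to an explicit path whose extension matches the AAC encoder (assumption).
  final path = '${Directory.systemTemp.path}/recording.m4a';
  await recorderController.record(path: path); // start recording
  await Future.delayed(const Duration(seconds: 5)); // record for ~5 seconds
  final recordedPath = await recorderController.stop(); // stop; returns the file path
  print('Recorded file: $recordedPath');
}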
We also noticed that when audio is recorded from Android, the file contains no Duration, AudioChannels, or SampleRate information, but when it is recorded from iOS it does. So the ffmpeg conversion turns it into a standard audio file.
@jackwill99 Can you check whether this issue still persists in 1.0.1 with the default settings? I would also suggest removing the bitRate parameter.
@ujas-m-simformsolutions Hello, I tested with your example and nothing changed; the bitRate parameter is already removed in the config, as shown below:
void _initialiseControllers() {
  recorderController = RecorderController()
    ..androidEncoder = AndroidEncoder.aac
    ..androidOutputFormat = AndroidOutputFormat.mpeg4
    ..iosEncoder = IosEncoder.kAudioFormatMPEG4AAC
    ..sampleRate = 44100;
}
Logs from Xcode:
MP3AudioFile.cpp:1526 Problem scanning for packets
MP3AudioFile.cpp:1065 MPEGAudioFile::OpenFromDataSource failed
AudioFileObject.cpp:105 OpenFromDataSource failed
AudioFileObject.cpp:80 Open failed
MP3AudioFile.cpp:1526 Problem scanning for packets
MP3AudioFile.cpp:1065 MPEGAudioFile::OpenFromDataSource failed
AudioFileObject.cpp:105 OpenFromDataSource failed
AudioFileObject.cpp:80 Open failed
ExtAudioFile.cpp:211 about to throw 'dta?': open audio file
AVAEInternal.h:109 [AVAudioFile.mm:135:AVAudioFileImpl: (ExtAudioFileOpenURL((CFURLRef)fileURL, &_extAudioFile)): error 1685348671
[VERBOSE-2:dart_vm_initializer.cc(41)] Unhandled Exception: PlatformException(AudioWaveforms, Failed to prepare player, null, null)
#0 StandardMethodCodec.decodeEnvelope (package:flutter/src/services/message_codecs.dart:652:7)
#1 MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:310:18)
<asynchronous suspension>
#2 AudioWaveformsInterface.preparePlayer (package:audio_waveforms/src/base/audio_waveforms_interface.dart:100:18)
<asynchronous suspension>
#3 PlayerController.preparePlayer (package:audio_waveforms/src/controllers/player_controller.dart:122:24)
<asynchronous suspension>
[VERBOSE-2:dart_vm_initializer.cc(41)] Unhandled Exception: PlatformException(AudioWaveforms, Failed to decode audio file, null, null)
#0 StandardMethodCodec.decodeEnvelope (package:flutter/src/services/message_codecs.dart:652:7)
#1 MethodChannel._invokeMethod (package:flutter/src/services/platform_channel.dart:310:18)
<asynchronous suspension>
#2 AudioWaveformsInterface.extractWaveformData (package:audio_waveforms/src/base/audio_waveforms_interface.dart:163:9)
<asynchronous suspension>
#3 PlayerController.extractWaveformData (package:audio_waveforms/src/controllers/player_controller.dart:169:20)
<asynchronous suspension>
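The two calls failing in the stack traces above are invoked roughly like this on my side. This is only a sketch; the named path parameter is my assumption, while preparePlayer and extractWaveformData are the methods visible in the stack traces.

import 'package:audio_waveforms/audio_waveforms.dart';

final playerController = PlayerController();

Future<void> loadRecording(String path) async {
  // Both calls throw a PlatformException when given the file recorded on Android.
  await playerController.preparePlayer(path: path); // "Failed to prepare player"
  await playerController.extractWaveformData(path: path); // "Failed to decode audio file"
}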
As I mentioned above, we also noticed that when audio is recorded from Android there is no Duration, AudioChannels, or SampleRate information, but when it is recorded from iOS there is. audio4.mp3 is the audio file from the example/assets/audios/ folder, and the other is the audio file recorded from Android. Here is the output for the Android audio file; let me know if I'm wrong.
Thanks.
And here is my flutter doctor -v output:
[✓] Flutter (Channel stable, 3.13.0, on macOS 14.0 23A344 darwin-arm64, locale
en-US)
• Flutter version 3.13.0 on channel stable at
/Users/tensor_lab/Documents/developer/flutter
• Upstream repository https://github.com/flutter/flutter.git
• Framework revision efbf63d9c6 (7 weeks ago), 2023-08-15 21:05:06 -0500
• Engine revision 1ac611c64e
• Dart version 3.1.0
• DevTools version 2.25.0
[✓] Android toolchain - develop for Android devices (Android SDK version 34.0.0)
• Android SDK at /Users/tensor_lab/Library/Android/sdk
• Platform android-34, build-tools 34.0.0
• ANDROID_HOME =
/Users/tensor_lab/.gem/bin:/Users/tensor_lab/.rbenv/shims:/Users/tensor_la
b/.rbenv/bin:/Users/tensor_lab/Downloads/apache-maven-3.9.2/bin:/opt/homeb
rew/bin:/opt/homebrew/bin:/opt/homebrew/sbin:/usr/local/bin:/System/Crypte
xes/App/usr/bin:/usr/bin:/bin:/usr/sbin:/sbin:/var/run/com.apple.security.
cryptexd/codex.system/bootstrap/usr/local/bin:/var/run/com.apple.security.
cryptexd/codex.system/bootstrap/usr/bin:/var/run/com.apple.security.crypte
xd/codex.system/bootstrap/usr/appleinternal/bin:/Library/Apple/usr/bin:/us
r/local/share/dotnet:~/.dotnet/tools:/Library/Frameworks/Mono.framework/Ve
rsions/Current/Commands:/Users/tensor_lab/Library/Application
Support/JetBrains/Toolbox/scripts:/Users/tensor_lab/Documents/developer/fl
utter/bin:/Users/tensor_lab/.pub-cache/bin:/Users/tensor_lab/Library/Andro
id/sdk
• Java binary at: /Applications/Android
Studio.app/Contents/jbr/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build
17.0.6+0-17.0.6b829.9-10027231)
• All Android licenses accepted.
[✓] Xcode - develop for iOS and macOS (Xcode 15.0)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Build 15A240d
• CocoaPods version 1.12.1
[✓] Chrome - develop for the web
• Chrome at /Applications/Google Chrome.app/Contents/MacOS/Google Chrome
[✓] Android Studio (version 2022.3)
• Android Studio at /Applications/Android Studio.app/Contents
• Flutter plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/9212-flutter
• Dart plugin can be installed from:
🔨 https://plugins.jetbrains.com/plugin/6351-dart
• Java version OpenJDK Runtime Environment (build
17.0.6+0-17.0.6b829.9-10027231)
[✓] IntelliJ IDEA Ultimate Edition (version 2023.2.2)
• IntelliJ at /Applications/IntelliJ IDEA.app
• Flutter plugin version 75.1.4
• Dart plugin version 232.9559.10
[✓] VS Code (version 1.82.3)
• VS Code at /Applications/Visual Studio Code.app/Contents
• Flutter extension version 3.74.0
[✓] Connected device (5 available)
• CPH2363 (mobile) • dd688732 •
android-arm64 • Android 13 (API 33)
• Jack Will (mobile) • 00008030-0001086C0279802E • ios
• iOS 17.0.2 21A351
• iPhone 15 Pro Max (mobile) • EB3CEC64-DF04-4D3E-860B-914868DB92FE • ios
• com.apple.CoreSimulator.SimRuntime.iOS-17-0 (simulator)
• macOS (desktop) • macos •
darwin-arm64 • macOS 14.0 23A344 darwin-arm64
• Chrome (web) • chrome •
web-javascript • Google Chrome 116.0.5845.187
[✓] Network resources
• All expected network resources are available.
• No issues found!
The iOS physical device model is an iPhone 11.
Any update on this one? Currently, the lib seems completely broken, as I see a bunch of these issues here but no response. Does it work for anyone on iOS?