Crashing in iOS 18 beta
Describe the bug: The preparePlayer function fails on the iOS 18 beta.
To Reproduce: Calling preparePlayer gives an error: PlatformException(AudioWaveforms, Failed to prepare player, The operation couldn't be completed. (OSStatus error 561017449.), null)
Expected behavior: preparePlayer should complete without an error.
Smartphone:
- Device: iPhone 15 Pro Max
- OS: iOS 18
Screenshots: Xcode crash report
Did you find any solution, @Sergelenchimeg? If yes, please let me know! Thanks.
Hello @Sergelenchimeg @dikshit1400, if possible can you please provide more information or a block of code? I have tested on iOS 18 and it's working well. Thank you.
I have been facing the same issue on iOS; here is the base code:
void _initializeControllers() {
  recorderController = RecorderController()
    ..androidEncoder = AndroidEncoder.aac
    ..androidOutputFormat = AndroidOutputFormat.mpeg4
    ..iosEncoder = IosEncoder.kAudioFormatMPEG4AAC
    ..sampleRate = 44100;
}
await playerController
    .preparePlayer(
      path: path!,
      noOfSamples: 100,
      volume: 1.0,
      shouldExtractWaveform: true,
    )
    .catchError((e) {
  printDebug(e.toString());
});
await playerController.startPlayer(finishMode: FinishMode.loop);
Error: flutter: PlatformException(AudioWaveforms, Failed to prepare player, The operation couldn’t be completed. (OSStatus error 2003334207.), null)
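As an aside, Core Audio OSStatus values are often four-character codes packed into a 32-bit integer, so decoding them can hint at which subsystem is failing. Below is a minimal Dart sketch of a diagnostic helper (fourCC is a hypothetical name, not part of the plugin) that decodes the two codes seen in this thread:

```dart
// Diagnostic helper: decode an OSStatus into its four-character code.
// Many Core Audio / AVAudioSession errors are FourCC values, so seeing
// the characters can help identify which subsystem reported the failure.
String fourCC(int status) {
  final bytes = [
    (status >> 24) & 0xFF,
    (status >> 16) & 0xFF,
    (status >> 8) & 0xFF,
    status & 0xFF,
  ];
  // Fall back to the raw number if any byte is not printable ASCII.
  final printable = bytes.every((b) => b >= 0x20 && b <= 0x7E);
  return printable ? String.fromCharCodes(bytes) : status.toString();
}

void main() {
  print(fourCC(561017449));  // prints "!pri"
  print(fourCC(2003334207)); // prints "wht?"
}
```

If that decoding is right, 561017449 ('!pri') looks like an AVAudioSession priority/activation error, while 2003334207 ('wht?') looks like a generic audio-file error, so the two reports here may have different root causes (audio session vs. file access).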
I am getting the same error.
Hi @Sergelenchimeg, are you still encountering this issue? If so, could you please share the steps to reproduce it? Your input will help us investigate further. Thank you!
Yes, I am still getting the same issue.
@korat2511 Can you please share the steps to reproduce the issue?
import 'dart:developer';
import 'dart:io';
import 'dart:ui' as ui;

import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:flutter/material.dart';
import 'package:flutter_svg/svg.dart';
import 'package:path_provider/path_provider.dart';
import 'package:permission_handler/permission_handler.dart';

import '../../configuration/globals.dart';
import '../../configuration/theme.dart';
import '../widgets/custom_button.dart';

class CreateTask extends StatefulWidget {
  final String siteId;

  const CreateTask({
    super.key,
    required this.siteId,
  });

  @override
  State<CreateTask> createState() => _CreateTaskState();
}

class _CreateTaskState extends State<CreateTask> {
  String? voiceFilePath;
  bool isPlaying = false;
  bool showRecordingBox = false;

  late final RecorderController recorderController;
  late final PlayerController playerController;
  String? path;
  late Directory directory;

  @override
  void initState() {
    super.initState();
    _initialiseController();
    playerController.onPlayerStateChanged.listen((state) {
      if (mounted && state == PlayerState.stopped) {
        setState(() {
          isPlaying = false;
        });
      }
    });
  }

  void _initialiseController() async {
    recorderController = RecorderController()
      ..androidEncoder = AndroidEncoder.aac
      ..androidOutputFormat = AndroidOutputFormat.mpeg4
      ..iosEncoder = IosEncoder.kAudioFormatMPEG4AAC
      ..sampleRate = 44100
      ..bitRate = 128000;

    playerController = PlayerController();

    directory = await getApplicationDocumentsDirectory();
  }

  void _startRecording() async {
    try {
      if (await recorderController.checkPermission()) {
        await recorderController.record(path: path);
        if (mounted) {
          setState(() {});
        }
      } else {
        var status = await Permission.microphone.request();
        if (status != PermissionStatus.granted) {
          print('Microphone permission not granted');
          return;
        }
      }
    } catch (e) {
      print('Error starting recording: $e');
    }
  }

  void _stopRecording() async {
    try {
      final recordedPath = await recorderController.stop();
      log("recorder path == $recordedPath");
      if (recordedPath != null && mounted) {
        log("before path $path voice $voiceFilePath");
        setState(() {
          path = recordedPath;
          voiceFilePath = recordedPath;
          isPlaying = false;
        });
        log("after path $path voice $voiceFilePath");
        await playerController.preparePlayer(
          path: recordedPath,
          shouldExtractWaveform: true,
        );
      }
    } catch (e) {
      print('Error stopping recording: $e');
    }
  }

  @override
  void dispose() {
    recorderController.dispose();
    playerController.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    final Size size = MediaQuery.of(context).size;
    return GestureDetector(
      onTap: () {
        FocusScope.of(context).unfocus();
      },
      child: Scaffold(
        appBar: AppBar(
          toolbarHeight: size.height * 0.06,
          backgroundColor: Globals.primaryColor,
          elevation: 2,
          shadowColor: Colors.black.withOpacity(0.5),
          leading: InkWell(
            onTap: () {
              setState(() {
                path = null;
                playerController.stopPlayer();
              });
              Navigator.pop(context);
            },
            child: const Padding(
              padding: EdgeInsets.all(4.0),
              child: Icon(Icons.arrow_back_ios),
            ),
          ),
          titleSpacing: 0,
          title: Padding(
            padding: const EdgeInsets.all(4.0),
            child: Text(
              "Create Task",
              maxLines: 1,
              overflow: TextOverflow.ellipsis,
              style: TextStyle(
                fontWeight: FontWeight.w600,
                fontSize: size.width * 0.046,
              ),
            ),
          ),
        ),
        body: Container(
          decoration: BoxDecoration(color: appThemeData().scaffoldBackgroundColor),
          height: double.infinity,
          width: double.infinity,
          padding: EdgeInsets.symmetric(horizontal: size.width * 0.05),
          child: Stack(
            children: [
              Column(
                crossAxisAlignment: CrossAxisAlignment.start,
                children: [
Padding(
padding: EdgeInsets.symmetric(
vertical: size.height * 0.025,
),
child: Wrap(
spacing: 10,
runSpacing: 10,
children: [
GestureDetector(
onTap: () {
setState(() {
showRecordingBox = true;
});
_scrollToFocusedField(_focusNodes[0]);
},
child: Container(
padding: const EdgeInsets.symmetric(
horizontal: 5, vertical: 5),
decoration: BoxDecoration(
border:
Border.all(color: Globals.primaryColor),
borderRadius: BorderRadius.circular(10)),
child: Row(
mainAxisSize: MainAxisSize.min,
children: [
SvgPicture.asset(
"assets/svg/voice.svg",
height: 17,
width: 17,
),
const SizedBox(width: 5),
Text(
"Voice Notes",
style: TextStyle(
fontSize: 12,
color: Globals.primaryColor,
fontWeight: FontWeight.w500),
)
],
),
),
),
GestureDetector(
onTap: () {
showEmployeeDialog(context, size);
},
child: Container(
padding: const EdgeInsets.symmetric(
horizontal: 5, vertical: 5),
decoration: BoxDecoration(
border:
Border.all(color: Globals.primaryColor),
borderRadius: BorderRadius.circular(10)),
child: Row(
mainAxisSize: MainAxisSize.min,
children: [
SvgPicture.asset(
"assets/svg/user.svg",
height: 17,
width: 17,
),
const SizedBox(width: 5),
Text(
"Assign",
style: TextStyle(
fontSize: 12,
color: Globals.primaryColor,
fontWeight: FontWeight.w500),
)
],
),
),
),
GestureDetector(
onTap: () {
setState(() {
if (remarkController.text.isEmpty &&
showRemarkBox == true) {
showRemarkBox = false;
} else {
showRemarkBox = true;
WidgetsBinding.instance
.addPostFrameCallback((_) {
FocusScope.of(context)
.requestFocus(_focusNodes[1]);
_scrollToFocusedField(_focusNodes[1]);
});
}
});
},
child: Container(
padding: const EdgeInsets.symmetric(
horizontal: 5, vertical: 5),
decoration: BoxDecoration(
border:
Border.all(color: Globals.primaryColor),
borderRadius: BorderRadius.circular(10)),
child: Row(
mainAxisSize: MainAxisSize.min,
children: [
SvgPicture.asset(
"assets/svg/notes.svg",
height: 17,
width: 17,
),
const SizedBox(width: 5),
Text(
"Remark",
style: TextStyle(
color: Globals.primaryColor,
fontSize: 12,
fontWeight: FontWeight.w500),
)
],
),
),
),
GestureDetector(
onTap: () {
setState(() {
if (instructionsController.text.isEmpty &&
showInstructionBox == true) {
showInstructionBox = false;
} else {
showInstructionBox = true;
WidgetsBinding.instance
.addPostFrameCallback((_) {
FocusScope.of(context)
.requestFocus(_focusNodes[2]);
_scrollToFocusedField(_focusNodes[2]);
});
}
});
},
child: Container(
padding: const EdgeInsets.symmetric(
horizontal: 5, vertical: 5),
decoration: BoxDecoration(
border:
Border.all(color: Globals.primaryColor),
borderRadius: BorderRadius.circular(10)),
child: Row(
mainAxisSize: MainAxisSize.min,
children: [
SvgPicture.asset(
"assets/svg/notes.svg",
height: 17,
width: 17,
),
const SizedBox(width: 5),
Text(
"Instructions",
style: TextStyle(
color: Globals.primaryColor,
fontSize: 12,
fontWeight: FontWeight.w500),
)
],
),
),
),
],
),
),
(showRecordingBox == true)
? Column(
children: [
buildVoiceNoteSection(size),
spaceBetweenField(),
],
)
: Container(),
SizedBox(
height: MediaQuery.of(context).padding.bottom +
size.height * 0.1,
),
],
),
Align(
alignment: Alignment.bottomCenter,
child: _isCreatingTask
? Padding(
padding: EdgeInsets.only(
left: size.width * 0.1,
right: size.width * 0.1,
bottom: MediaQuery.of(context).padding.bottom + 20,
),
child: SizedBox(
width: size.width,
height: size.height * 0.055,
child: Container(
decoration: BoxDecoration(
borderRadius: BorderRadius.circular(10),
color: Globals.primaryColor),
child: Center(
child: CircularProgressIndicator(
color: Colors.white,
)),
),
),
)
: CustomButton(
text: 'Create Task',
onPressed: () => _createTask(context),
),
),
],
),
),
),
);
}
  Widget buildVoiceNoteSection(s) {
    return Column(
      children: [
        Container(
          width: double.infinity,
          height: 170,
          padding: EdgeInsets.all(16),
          decoration: BoxDecoration(
            color: Colors.grey[200],
            borderRadius: BorderRadius.circular(12),
          ),
          child: Column(
            mainAxisAlignment: MainAxisAlignment.center,
            children: [
              Container(
                height: 70,
                child: _buildWaveform(),
              ),
              SizedBox(height: 20),
              _buildControls(),
            ],
          ),
        ),
      ],
    );
  }

  Widget _buildWaveform() {
    if (recorderController.isRecording) {
      return AudioWaveforms(
        enableGesture: true,
        size: Size(MediaQuery.of(context).size.width - 32, 70),
        recorderController: recorderController,
        waveStyle: WaveStyle(
          gradient: ui.Gradient.linear(
            const Offset(70, 50),
            Offset(MediaQuery.of(context).size.width / 2, 0),
            [Colors.red, Colors.green],
          ),
        ),
      );
    } else if (path != null) {
      return AudioFileWaveforms(
        enableSeekGesture: true,
        size: Size(MediaQuery.of(context).size.width - 32, 70),
        playerController: playerController,
        playerWaveStyle: const PlayerWaveStyle(
          scaleFactor: 0.8,
          fixedWaveColor: Colors.white30,
          liveWaveColor: Colors.white,
          waveCap: StrokeCap.butt,
        ),
      );
    } else {
      return Container(
        alignment: Alignment.center,
        child: Text('Tap the microphone to start recording',
            style: TextStyle(color: Colors.grey[600])),
      );
    }
  }

  void _playandPause() async {
    log("Is Playing == $isPlaying");
    try {
      if (isPlaying) {
        log("IP");
        await playerController.pausePlayer();
      } else {
        if (playerController.playerState == PlayerState.stopped) {
          await playerController.preparePlayer(
            path: path!,
            shouldExtractWaveform: true,
          );
        }
        await playerController.startPlayer(
          // finishMode: FinishMode.stop,
          forceRefresh: true,
        );
      }
      if (mounted) {
        setState(() {
          isPlaying = !isPlaying;
        });
      }
    } catch (e) {
      print('Error playing/pausing: $e');
    }
  }

  Widget _buildControls() {
    if (recorderController.isRecording) {
      return GestureDetector(
        onTap: _stopRecording,
        child: Container(
          width: 40,
          height: 40,
          decoration: BoxDecoration(
            shape: BoxShape.circle,
            color: Colors.red,
          ),
          child: Icon(
            Icons.stop,
            color: Colors.white,
            size: 24,
          ),
        ),
      );
    } else if (path != null) {
      return Row(
        mainAxisAlignment: MainAxisAlignment.center,
        children: [
          ElevatedButton.icon(
            onPressed: _playandPause,
            icon: Icon(isPlaying ? Icons.pause : Icons.play_arrow),
            label: Text(isPlaying ? 'Pause' : 'Play'),
            style: ElevatedButton.styleFrom(
              backgroundColor: Globals.primaryColor,
              foregroundColor: Colors.white,
            ),
          ),
          SizedBox(width: 20),
          ElevatedButton.icon(
            onPressed: () {
              if (mounted) {
                setState(() {
                  path = null;
                  playerController.stopPlayer();
                });
              }
            },
            icon: Icon(Icons.delete),
            label: Text('Delete'),
            style: ElevatedButton.styleFrom(
              backgroundColor: Colors.red,
              foregroundColor: Colors.white,
            ),
          ),
        ],
      );
    } else {
      return GestureDetector(
        onTap: _startRecording,
        child: Container(
          width: 40,
          height: 40,
          decoration: BoxDecoration(
            shape: BoxShape.circle,
            color: Globals.primaryColor,
          ),
          child: Icon(
            Icons.mic,
            color: Colors.white,
            size: 24,
          ),
        ),
      );
    }
  }
}
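For anyone trying to isolate this, a stripped-down harness that exercises only the recorder and player calls used above might look like the sketch below. The function name recordThenPlay, the explicit output path repro_note.m4a, the three-second delay, and the file-existence check are assumptions added for illustration; everything else mirrors calls that already appear in this thread.

```dart
// Minimal reproduction sketch: record, stop, verify the file, then prepare
// and play. Only the audio_waveforms calls already shown in this thread are
// exercised; the explicit output path and the file check are assumptions
// added to rule out file-access problems before preparePlayer.
import 'dart:io';

import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:path_provider/path_provider.dart';

Future<void> recordThenPlay() async {
  final recorder = RecorderController();
  final player = PlayerController();

  if (!await recorder.checkPermission()) {
    print('Microphone permission not granted');
    return;
  }

  // Record to an explicit path instead of passing null.
  final dir = await getApplicationDocumentsDirectory();
  final outPath = '${dir.path}/repro_note.m4a';
  await recorder.record(path: outPath);
  await Future.delayed(const Duration(seconds: 3));
  final recordedPath = await recorder.stop();

  // Confirm the recording exists and is non-empty before preparing the
  // player; an unreadable or empty file is one plausible cause of the
  // OSStatus errors reported above.
  if (recordedPath == null || !File(recordedPath).existsSync()) {
    print('No recording was produced at $recordedPath');
    return;
  }
  print('Recorded ${File(recordedPath).lengthSync()} bytes at $recordedPath');

  await player.preparePlayer(path: recordedPath, shouldExtractWaveform: true);
  await player.startPlayer();
  // Dispose the recorder/player when the surrounding widget is torn down,
  // as in the dispose() override above.
}
```

If the file check fails or reports zero bytes, the problem is upstream of preparePlayer; if it passes and preparePlayer still throws, the stack trace from that run would be worth attaching here.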
@korat2511 Code snippet is hard to review. Could you simply let me know which parameters need to be passed for the issue to occur?
Any solution on it?
@korat2511 Can you please send me the video as well as the stack trace of the issue? The reason is that we can't reproduce it on our end.
Please share your email ID so I can share the video with you.
@korat2511 You can share the stack trace and video here. So everyone can be on the same page.
The video is too long, that's why.
https://drive.google.com/file/d/1wM715-vmvy_qtaqEZZXi8W3pRsj4KWyC/view?usp=sharing
Any update on it?
@korat2511 I think there is some mistake in the implementation, but I am unable to debug the issue.
Due to inactivity, I will be closing this issue. Feel free to create a new issue with reproducible steps.