
AudioWaveforms with RecorderController does not show any waves

vongrad opened this issue 1 year ago · 10 comments

Describe the bug
I cloned the audio_waveforms GitHub repo, opened the example project, and ran it on an iPhone 14 Pro (iOS 16.4) simulator; however, no waves are shown when recording audio. (I have not modified any of your code.)

To Reproduce
Steps to reproduce the behavior:

  1. Clone audio_waveforms repo
  2. Open up the example project
  3. Run it on an iOS simulator (16.4)
  4. Start recording the sound
  5. No visual waves are shown, just dots with [x*t, 0]

Expected behavior
The waves should react to the recorded audio.
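For reference, a minimal sketch of the kind of wiring involved, with one RecorderController shared between the record()/stop() calls and the AudioWaveforms widget. The widget name, layout, and button here are illustrative, not copied from the example project:

import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:flutter/material.dart';

class WaveRecorderDemo extends StatefulWidget {
  const WaveRecorderDemo({super.key});

  @override
  State<WaveRecorderDemo> createState() => _WaveRecorderDemoState();
}

class _WaveRecorderDemoState extends State<WaveRecorderDemo> {
  final RecorderController _controller = RecorderController();
  bool _isRecording = false;

  Future<void> _toggleRecording() async {
    if (_isRecording) {
      await _controller.stop();
    } else {
      // path is optional in recent plugin versions; pass one explicitly if needed.
      await _controller.record();
    }
    setState(() => _isRecording = !_isRecording);
  }

  @override
  void dispose() {
    _controller.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return Column(
      mainAxisSize: MainAxisSize.min,
      children: [
        // The waveform widget listens to the same controller that records.
        AudioWaveforms(
          size: Size(MediaQuery.of(context).size.width, 50),
          recorderController: _controller,
          waveStyle: const WaveStyle(
            extendWaveform: true,
            showMiddleLine: false,
          ),
        ),
        IconButton(
          icon: Icon(_isRecording ? Icons.stop : Icons.mic),
          onPressed: _toggleRecording,
        ),
      ],
    );
  }
}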

Smartphone (please complete the following information):

  • Device: iPhone 14 Pro (simulator)
  • OS: iOS 16.4

vongrad avatar Sep 14 '23 16:09 vongrad

https://github.com/SimformSolutionsPvtLtd/audio_waveforms/assets/26495484/6fb240cc-66a8-4e12-916d-6d275d60bb76

I have the same issue with my phone:

  • Phone model: Realme 5 Pro
  • Android version: 10

It works fine on the other emulators and devices (Android & iOS).

hossam-96 avatar Dec 18 '23 14:12 hossam-96

I got the same issue. Did you find any solution?

Priyanshu85 avatar Jan 29 '24 12:01 Priyanshu85

@vongrad Can you please try the new 1.0.5 version? #242 might have fixed this.

Ujas-Majithiya avatar Mar 16 '24 18:03 Ujas-Majithiya

I am also experiencing this on a physical device, iOS 17.4.1. Latest version 1.0.5.

jt274 avatar Apr 18 '24 05:04 jt274

I've got the same problem using the new version on Android.

Heropowwa avatar May 03 '24 21:05 Heropowwa

I tested on an iPhone 14 simulator with iOS 17.2, and the recording waves are generating just fine for me. Did you get any exceptions or logs while recording? If you can share them, it would help us find the issue.
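For surfacing such exceptions, a minimal sketch that wraps the record call in a try/catch, assuming a RecorderController like the ones shown below (the helper name and logging are illustrative):

import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:flutter/foundation.dart';

// Illustrative helper: if no waves ever appear, printing the error thrown by
// record() shows whether recording failed to start in the first place.
Future<void> startRecordingWithLogging(
  RecorderController controller,
  String path,
) async {
  try {
    await controller.record(path: path);
  } catch (e, stackTrace) {
    debugPrint('record() threw: $e\n$stackTrace');
  }
}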

Ujas-Majithiya avatar May 05 '24 16:05 Ujas-Majithiya

This is the code of the widget that I use to record:

import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:chat/Controller/AudioController.dart';
import 'package:flutter/material.dart';
import 'package:get/get.dart';

class RecordAudioMessage extends StatefulWidget {
  const RecordAudioMessage({super.key});

  @override
  State<RecordAudioMessage> createState() => _RecordAudioMessageState();
}

class _RecordAudioMessageState extends State<RecordAudioMessage> {
  AudioController audioController = Get.put(AudioController());

  @override
  void initState() {
    super.initState();
  }

  @override
  Widget build(BuildContext context) {
    return AudioWaveforms(
      size: Size(MediaQuery.of(context).size.width * 0.5, 25),
      recorderController: audioController.recorderController,
      enableGesture: true,
      waveStyle: const WaveStyle(
        extendWaveform: true,
        showMiddleLine: false,
      ),
    );
  }
}

This is the audio controller:

import 'dart:io';
import 'package:audio_waveforms/audio_waveforms.dart';
import 'package:chat/Controller/ProfileController.dart';
import 'package:get/get.dart';
import 'package:path_provider/path_provider.dart';
import 'package:permission_handler/permission_handler.dart';

class AudioController extends GetxController {
  RxString recordFilePath = "".obs;
  RxString audioUrl = "".obs;
  RxBool isRecording = false.obs;
  RxBool isPlayingMsg = false.obs;
  ProfileController profileController = Get.put(ProfileController());
  PlayerController playerController = PlayerController();
  late final RecorderController recorderController;
  late String? path;
  bool isInitialised = false;

  Future<String> getFilePath() async {
    Directory storageDirectory = await getApplicationDocumentsDirectory();
    String sdPath = "${storageDirectory.path}/record";
    var d = Directory(sdPath);
    if (!d.existsSync()) {
      d.createSync(recursive: true);
    }
    return "$sdPath/audio.mpeg4";
  }

  Future<bool> checkPermission() async {
    if (!await Permission.microphone.isGranted) {
      PermissionStatus status = await Permission.microphone.request();
      if (status != PermissionStatus.granted) {
        return false;
      }
    }
    return true;
  }

  void initialiseController() {
    if (!isInitialised) {
      recorderController = RecorderController()
        ..androidEncoder = AndroidEncoder.aac
        ..androidOutputFormat = AndroidOutputFormat.mpeg4
        ..iosEncoder = IosEncoder.kAudioFormatMPEG4AAC
        ..sampleRate = 44100
        ..bitRate = 48000;
      isInitialised = true;
    }
  }

  void startRecording() async {
    initialiseController();
    bool hasPermission = await checkPermission();
    if (hasPermission) {
      isRecording.value = true;
      recordFilePath.value = await getFilePath();
      await recorderController.record(path: recordFilePath.value);
    } else {
      //Toast error
    }
  }

  void stopRecording() async {
    final path = await recorderController.stop();
    isRecording.value = false;
    print(path);
  }
}

The device I use for testing is a Redmi Note 12 with Android 14. I didn't get any particular logs while recording.
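A minimal diagnostic sketch, assuming the stopRecording() shown above (the helper name is illustrative): checking whether the recorder actually wrote a non-empty file separates a recording failure from a waveform-rendering failure.

import 'dart:io';

// Illustrative helper: call with the path returned by recorderController.stop().
Future<void> inspectRecording(String? path) async {
  if (path == null) {
    print('Recorder returned no path');
    return;
  }
  final file = File(path);
  final exists = await file.exists();
  final size = exists ? await file.length() : 0;
  print('Recording at $path exists=$exists size=$size bytes');
}

If the file exists but its size is 0, the problem is in the recording itself rather than in the AudioWaveforms widget.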

Heropowwa avatar May 06 '24 13:05 Heropowwa

The bitrate value of 48k falls outside the supported range of the mpeg4 audio format on Android. Can you please try a higher bitrate value?

Ujas-Majithiya avatar May 07 '24 08:05 Ujas-Majithiya

I tried this:

  void initialiseController() {
    if (!isInitialised) {
      recorderController = RecorderController()
        ..androidEncoder = AndroidEncoder.aac
        ..androidOutputFormat = AndroidOutputFormat.mpeg4
        ..iosEncoder = IosEncoder.kAudioFormatMPEG4AAC
        ..sampleRate = 44100
        ..bitRate = 128000;
      isInitialised = true;
    }
  }

But it still doesn't show waves while recording.

[Screenshot: Screenshot_2024-05-07-13-43-41-466_com.example.chat]

Heropowwa avatar May 07 '24 11:05 Heropowwa