
Rewrite iOS implementation based on AVAudioEngine

Open ryanheise opened this issue 3 years ago • 57 comments

Is your feature request related to a problem? Please describe.

Certain features such as a visualizer (#97 ), equalizer (#147 ), pitch shifting (#329 ) may be more easily implemented if based on AVAudioEngine rather than the current AVQueuePlayer.

Describe the solution you'd like

Either reimplement using AVAudioEngine directly, or use an AVAudioEngine-based library such as AudioKit.

Describe alternatives you've considered

We can get some of the way there by plugging an audio tap processor into AVQueuePlayer, but unfortunately this does not give us access to iOS's built-in pitch shifting API which is only available via AVAudioEngine. This could still be possible via the audio tap processor by manually implementing the pitch shifting algorithm, or perhaps integrating the C version of sonic, but long term an AVAudioEngine-based implementation may end up being more flexible in terms of implementing other audio processing features.
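As a rough illustration of what AVAudioEngine would buy us, here is a minimal sketch of an engine graph using the built-in AVAudioUnitTimePitch node, which is exactly the API that AVQueuePlayer (a closed pipeline) can't reach. The file name is hypothetical:

```swift
import AVFoundation

// Minimal sketch: an AVAudioEngine graph with the built-in pitch shifter.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()
timePitch.pitch = 300      // cents; +300 = three semitones up
timePitch.rate = 1.25      // playback rate, independently of pitch

engine.attach(player)
engine.attach(timePitch)
engine.connect(player, to: timePitch, format: nil)
engine.connect(timePitch, to: engine.mainMixerNode, format: nil)

// Assumes "track.m4a" exists in the app bundle (hypothetical file).
if let url = Bundle.main.url(forResource: "track", withExtension: "m4a"),
   let file = try? AVAudioFile(forReading: url) {
    player.scheduleFile(file, at: nil)
    try? engine.start()
    player.play()
}
```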

Additional context

None

UPDATE

This will be a collaborative effort. Anyone can contribute, and we will be following the plan in this shared Google Doc:

https://docs.google.com/document/d/17EZEvmiyn94GCwddBGS5BAaYer5BTRFv-ENIAPG-WG4/edit?usp=sharing

We are currently in the research phase, so you can contribute by sharing any relevant links to useful resources you have found to help implement each feature in the list.

ryanheise avatar Mar 10 '21 10:03 ryanheise

Is it an option to move to Swift?

mvolpato avatar Mar 10 '21 12:03 mvolpato

Yes, this probably will be written in Swift.

ryanheise avatar Mar 10 '21 12:03 ryanheise

this will be breaking for folks, don't do this

nt4f04uNd avatar Mar 10 '21 14:03 nt4f04uNd

@nt4f04uNd Do you mean breaking in terms of stability, or breaking in terms of codec support?

ryanheise avatar Mar 10 '21 14:03 ryanheise

In the sense that only these app/library language combinations work:

| app   | library |
|-------|---------|
| obj-c | obj-c   |
| swift | obj-c   |
| swift | swift   |

A library written in Swift can't be used from an Obj-C project.

nt4f04uNd avatar Mar 10 '21 14:03 nt4f04uNd

That's true, although just to play devil's advocate:

  • New Flutter projects have been Swift by default now for more than a year.
  • People are converting older Objective C projects to Swift.
  • Consequently, this is a vanishing problem, and people who have it can solve it with flutter create.

This will also virtually only affect the more experienced Flutter developers (who created their project before Flutter changed the default template to Swift), so if we include instructions in the README on how to convert their project from Objective C to Swift, I wouldn't expect Swift to be a showstopper.

It would be interesting to know the actual statistics on how many people are still running Objective C projects. Plugins are a different story, as people may have various reasons to choose Objective C vs Swift when writing their plugin in that language. But a project doesn't actually contain any Objective C or Swift code so it really doesn't make a difference whether you switch a project from Objective C to Swift except that it just opens the door to accessing all of the Swift plugins on pub.dev.

There are some instructions in the README for Android on how to convert old projects to the latest V2 plugin architecture, and in a similar vein I could add instructions for how to update an old Objective C project to Swift.

ryanheise avatar Mar 10 '21 15:03 ryanheise

(Copying this comment from another issue to get broader interest)

Is this supported on iOS currently?

The waveform visualizer is implemented on iOS but not pitch. You can track the pitch feature here: #329

There is a big question at this point whether to continue with the current AVQueuePlayer-based implementation or switch to an AVAudioEngine-based implementation. For pitch scaling, I really want to take advantage of AVAudioEngine's built-in features, but that requires a rewrite of the iOS side - see #334 and this is a MUCH bigger project.

I would really like to see an AVAudioEngine-based solution see the light of day, but it will probably not happen if I work on it alone. If anyone would like to help, maybe we can pull it off with some solid open source teamwork. One of the attractive solutions is to use AudioKit, a library built on top of AVAudioEngine which provides access to pitch adjustment AND a ready-made API for a visualizer and equalizer. That is, it provides us with everything we need - BUT it is written in Swift, so that involves a language change, and it means we may need to deal with complaints that old projects don't compile (we'd need to provide extra instructions on how to update their projects to be Swift-compatible).

Would anyone like to help me with this? (Please reply on #334)

ryanheise avatar Apr 01 '21 00:04 ryanheise

Another interesting library: https://github.com/tanhakabir/SwiftAudioPlayer

This suggests we want a combination of AVAudioEngine and AudioToolbox:

Thus, using AudioToolbox, we are able to stream audio and convert the downloaded data into usable data for the AVAudioEngine to play. For an overview of our solution check out our blog post.

Given that I'm spread a bit thinly, I would not like to make this a solo effort, so I will wait until some more people post below who might be willing to team up and work together.

One other thought I had is that I may want to move the current iOS implementation out into its own federated plugin implementation rather than bundling it with the main plugin, although it can still be endorsed so the dependency automatically gets added to your app. The advantage of this is that if we can eventually create an AVAudioEngine-based implementation of the just_audio API, we can now do this without throwing away the old implementation. In case there are some features that don't work in the new implementation, users will still be able to use the old implementation, and vice versa.

ryanheise avatar Apr 02 '21 03:04 ryanheise

Blacklist https://github.com/tanhakabir/SwiftAudioPlayer. I tried to use it and it's horrible: the audio glitches and the author barely maintains it. If you are researching variants of libraries you could use, I saw https://github.com/tumtumtum/StreamingKit (it's FOSS by the way).

imho you should just stick to the most popular one, i.e. AudioKit

nt4f04uNd avatar Apr 02 '21 04:04 nt4f04uNd

I like the idea of a federated implementation; in fact, this should be done for each platform

nt4f04uNd avatar Apr 02 '21 04:04 nt4f04uNd

Blacklist https://github.com/tanhakabir/SwiftAudioPlayer. I tried to use it and it's horrible: the audio glitches and the author barely maintains it. If you are researching variants of libraries you could use, I saw https://github.com/tumtumtum/StreamingKit (it's FOSS by the way).

imho you should just stick to the most popular one, i.e. AudioKit

I agree with you although what I did find interesting about SwiftAudioPlayer is not that I want to use it as a library, but rather that the author has written an informative blog post explaining the additional components that were needed on top of AVAudioEngine, which may also apply to us even if we go with AudioKit. For example, to stream audio, we will probably need to use techniques similar to those used in SwiftAudioPlayer to first "get" the audio to then feed into AudioKit.
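To sketch that idea (this is not code from either library): once the audio bytes have been fetched and decoded, playback with AVAudioEngine amounts to scheduling PCM buffers on a player node. For simplicity the source here is a local AVAudioFile; a network stream would instead parse incoming bytes with AudioToolbox's AudioFileStream and convert packets to PCM (e.g. with AVAudioConverter) before scheduling:

```swift
import AVFoundation

// Sketch of the "feed buffers to the engine" half of streaming.
func scheduleChunks(of file: AVAudioFile, on player: AVAudioPlayerNode) {
    let format = file.processingFormat
    let chunkFrames: AVAudioFrameCount = 4096
    while file.framePosition < file.length {
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: chunkFrames),
              (try? file.read(into: buffer)) != nil,
              buffer.frameLength > 0 else { break }
        // Each scheduled buffer is queued; the node plays them back to back.
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```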

Thanks for sharing StreamingKit, the name itself implies that it will be another good reference when trying to implement the streaming part of this.

ryanheise avatar Apr 02 '21 06:04 ryanheise

Hi @ryanheise, I would be happy to help but I do have some doubts:

  1. I have no experience with development of audio components.
  2. I have no experience with developing a plugin for Flutter.

I do have experience with Swift, and that would be my preference over Objective-C.

So I would be more comfortable with a list of clear (smaller) tasks to follow, and not just "Implement AudioKit" or similar. I do not know if this is what you are looking for.

mvolpato avatar Apr 02 '21 09:04 mvolpato

FYI, StreamingKit is not just for streaming; to my understanding it offers pretty much the same set of features as AudioKit

nt4f04uNd avatar Apr 03 '21 01:04 nt4f04uNd

@mvolpato I'm happy to have your support! We definitely need a plan which can be broken down into tasks that can be each done by different people.

What I'll do is create a Google Doc with a list of features that need to be supported, and the first thing that needs to be done is simply to collect links to documentation, tutorials, or StackOverflow answers relevant to implementing each feature. That first research phase will give us the pieces of the puzzle we need to then prioritise the tasks and start implementing them.

So the plan is:

  1. Create a list of features to implement
  2. Collect research for each feature
  3. Work out what needs to be done first
  4. Start implementing

The 3rd point is not in strict order so we can start thinking about that earlier, but I think the order in which to do things will fall out naturally. e.g. First we should implement loading a simple audio source, then playing, then pausing, then the other standard controls. State broadcasting could happen in parallel with this.

I'll also need to move the current iOS implementation into a separate platform implementation package.

I'll post a link to the doc once I create it.

ryanheise avatar Apr 03 '21 01:04 ryanheise

Shared doc: https://docs.google.com/document/d/17EZEvmiyn94GCwddBGS5BAaYer5BTRFv-ENIAPG-WG4/edit?usp=sharing

I will update the top post with details.

ryanheise avatar Apr 03 '21 02:04 ryanheise

Hi @mvolpato, I'm ready to get the ball rolling on a Swift-based implementation. Swift is nice, but there seem to be complications in how the compiler works.

Would you be able to try these steps out below in your environment and see if they work for you?

First, create a new plugin from the template:

flutter create -t plugin --platforms android,ios audiokit

Then in ios/SwiftAudiokitPlugin.swift I import AudioKit and make a symbolic reference:

import AudioKit
...
var mixer: AKMixer

Then in ios/audiokit.podspec I added the cocoapods dep. Tried various alternatives, none of which worked, though:

  s.dependency 'AudioKit/Core', '~> 4.0'
  # s.dependency 'AudioKit/Core', '4.11'
  # s.dependency 'AudioKit/Core', :git=>'https://github.com/AudioKit/AudioKit/', :branch => 'v5-develop'

If you try to run the example directory, you get the error mentioned here: https://github.com/AudioKit/AudioKit/issues/2267

Would be interested if you could try the above steps also and see if it works for you. Note that I also upgraded my cocoapods and Xcode to the latest versions before running this.

ryanheise avatar Apr 12 '21 06:04 ryanheise

If it also fails for you (and a solution doesn't turn up) then we may need to create either a Flutter issue or an AudioKit issue. According to reports in the above AudioKit issue, the error was resolved, so maybe it's an issue that occurs due to the Flutter setup specifically.

ryanheise avatar Apr 12 '21 06:04 ryanheise

'AudioKit/Core', :git=>'https://github.com/AudioKit/AudioKit/', :branch => 'v5-develop'

It looks like cocoapods is not supported (yet) for version 5, so this will not work for sure.

I also cannot get it to work. I tried different versions.

mvolpato avatar Apr 12 '21 07:04 mvolpato

It looks like this other plugin has it working. I did not have time to investigate their approach yet. I will check later today.

mvolpato avatar Apr 12 '21 07:04 mvolpato

Great find! I think what's happening in my project (and my environment) is that it's resolving AudioKit to 4.11 whereas flutter_sequencer is resolving it to 4.11.1, and according to that issue, the bug fix requires 4.11.1 or later.

I was scratching my head for a while, but it was caching the first version it ever resolved to when I made my first attempt at the podspec, even after running flutter clean. It turns out flutter clean deletes neither the Pods directory nor the Podfile.lock file. This is why flutter_sequencer ends up resolving the AudioKit dependency to 4.11.1 when there's an even later version, 4.11.2, available. Anyway, I deleted everything and with a fresh start it works!

I think next it will probably be necessary to move the current iOS implementation into a separate package like the web one, then the new AudioKit implementation can just be another alternative to that.

ryanheise avatar Apr 12 '21 15:04 ryanheise

Ok, I will start working on this, but

I think next it will probably be necessary to move the current iOS implementation into a separate package like the web one, then the new AudioKit implementation can just be another alternative to that.

I do not get why this is needed. Would that mean that users will be able to choose between the current implementation with just_audio and the AudioKit one with just_audio_ios_audiokit?

mvolpato avatar Apr 12 '21 19:04 mvolpato

@ryanheise in this branch I just copied the iOS implementation into a separate folder. Let me know if this is what you meant. I will remove the Android code from it.

mvolpato avatar Apr 12 '21 19:04 mvolpato

Hi @mvolpato I actually haven't fully experimented with the different approaches, but the goal is to use the federated plugin architecture to allow people to opt in to the AudioKit implementation. This is because in the beginning we can expect it to lack features and stability compared to the original implementation, even if it may also implement some new features, so people will then be able to choose the federated implementation they prefer.

But maybe the mechanism we need to do this is possible even without moving the iOS implementation into a separate package (I have not tested that). It could be worth trying that first and if it fails then move the iOS implementation into a separate package. In the latter case, it should also be endorsed in the main just_audio pubspec.yaml file in the same way the web implementation is endorsed.
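For reference, endorsement would look roughly like this in just_audio's pubspec.yaml, mirroring the existing web entry (package names and versions here are purely illustrative):

```yaml
flutter:
  plugin:
    platforms:
      ios:
        default_package: just_audio_ios   # hypothetical package name
      web:
        default_package: just_audio_web

dependencies:
  just_audio_ios: ^0.0.1   # illustrative versions
  just_audio_web: ^0.2.0
```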

Also I will probably create a new branch just for AudioKit development, and that in turn will branch off dev. I will try to set this up some time today.

ryanheise avatar Apr 13 '21 01:04 ryanheise

If we separate the AudioKit based plugin from the rest (like the web one) then branching is not really necessary.

I will start with a new plugin, following the setup of the web one. Then I will either PR against the new branch or the old dev.

mvolpato avatar Apr 13 '21 07:04 mvolpato

The doc is public (I don't know if this is intended). I added yet another engine solution there: https://github.com/sbooth/SFBAudioEngine/

nt4f04uNd avatar Apr 13 '21 13:04 nt4f04uNd

It's intended :-) Thanks for adding another resource!

ryanheise avatar Apr 13 '21 13:04 ryanheise

I think next it will probably be necessary to move the current iOS implementation into a separate package like the web one, then the new AudioKit implementation can just be another alternative to that.

While this is theoretically possible, we do lose the ability to symlink the iOS and macOS implementations because pub has a security restriction that prevents symlinks outside the project root, and we'd now want to share code across the separate just_audio_macos and just_audio_ios projects.

Maybe instead of having a separate package for the iOS and macOS platforms, we should have just one, just_audio_darwin, which includes both the iOS and macOS platform implementations within the same project. Then we could have just_audio_audiokit_darwin for the alternative implementation, again containing both the iOS and macOS implementations.
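A rough sketch of how such a combined podspec might declare both platforms from a single source tree (abbreviated; the name, versions, and deployment targets are illustrative, and a real podspec also needs homepage/license/author fields):

```ruby
# Hypothetical just_audio_darwin.podspec: one Classes/ tree serves
# both platforms, so no symlinks are needed.
Pod::Spec.new do |s|
  s.name         = 'just_audio_darwin'
  s.version      = '0.0.1'
  s.summary      = 'Shared iOS/macOS implementation of just_audio.'
  s.source       = { :path => '.' }
  s.source_files = 'Classes/**/*'

  s.ios.deployment_target = '9.0'
  s.osx.deployment_target = '10.12.2'
  s.ios.dependency 'Flutter'
  s.osx.dependency 'FlutterMacOS'
end
```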

ryanheise avatar Apr 13 '21 14:04 ryanheise

As a user of the package, how would you select in the pubspec which ios version to use? Does it matter if you want to use the current Android implementation and the new AudioKit for iOS?

mvolpato avatar Apr 13 '21 15:04 mvolpato

I'm looking into this now, but it seems the federated architecture doesn't fully support what we want to do anyway.

Although the federated plugin design doc indicates that there is such a thing as a "default" implementation for a platform, implying that it can be overridden, it seems that isn't actually supported. If there is no default, an app can just add just_audio_audiokit to its dependencies and it will work. But if we try to provide a default, both end up getting added and things are non-deterministic.

I have created an issue to request at least this problem to be fixed: https://github.com/flutter/flutter/issues/80374

But as to your question of Android, the Android implementation would not be in the same package as the iOS/macOS package, so you would theoretically be able to vary these independently (subject to the problem of overriding defaults being resolved)

ryanheise avatar Apr 13 '21 15:04 ryanheise

There is a solution being worked on by the Flutter team which will probably allow the default implementation to stay within the main just_audio plugin. So it should be fine to create a new directory alongside just_audio_web called just_audio_audiokit. It won't work until that solution is ready, but to work around that in the short term we can just temporarily comment out the default iOS implementation's registration code:

+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar>*)registrar {
    /*
    FlutterMethodChannel* channel = [FlutterMethodChannel
        methodChannelWithName:@"com.ryanheise.just_audio.methods"
              binaryMessenger:[registrar messenger]];
    JustAudioPlugin* instance = [[JustAudioPlugin alloc] initWithRegistrar:registrar];
    [registrar addMethodCallDelegate:instance channel:channel];
    */
}

The audiokit plugin also needs to have a different class name from the default plugin because the flutter tool will still try to add both to the generated plugin registrant and they will conflict if two plugin classes have the same name.

So maybe we can call the built-in one JustAudioPlugin and the AudioKit one JustAudioAudioKitPlugin.
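A minimal sketch of what the alternative plugin class's registration might look like in Swift, using the distinct class name (details are illustrative; it would register on the same method channel that the commented-out default implementation vacates):

```swift
import Flutter

// Sketch: the AudioKit-based plugin registers under its own class name
// so the generated plugin registrant doesn't clash with JustAudioPlugin.
public class JustAudioAudioKitPlugin: NSObject, FlutterPlugin {
    public static func register(with registrar: FlutterPluginRegistrar) {
        let channel = FlutterMethodChannel(
            name: "com.ryanheise.just_audio.methods",
            binaryMessenger: registrar.messenger())
        let instance = JustAudioAudioKitPlugin()
        registrar.addMethodCallDelegate(instance, channel: channel)
    }

    public func handle(_ call: FlutterMethodCall, result: @escaping FlutterResult) {
        // Method handlers would go here; nothing is implemented yet.
        result(FlutterMethodNotImplemented)
    }
}
```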

ryanheise avatar Apr 14 '21 16:04 ryanheise

So maybe we can call the built-in one JustAudioPlugin and the AudioKit one JustAudioAudioKitPlugin.

JustAudioKitPlugin :)

Ok, i will look into it in the coming days

mvolpato avatar Apr 14 '21 18:04 mvolpato

Unfortunately, or maybe that is a good thing with everything that is going on now in the world, I am very busy with my day job, and I had no time to look into this. :(

mvolpato avatar May 29 '21 08:05 mvolpato

No problem at all! Hopefully soon I'll be able to take a crack at getting something started.

ryanheise avatar May 29 '21 08:05 ryanheise

Has there been any updates on this?

NelsonJTSM avatar Jul 31 '21 21:07 NelsonJTSM

Instead of using iOS's native libraries, why don't we use something like libvlc? There's already work being done with it for desktop support, and libvlc has advantages such as better codec support (OPUS and Vorbis). In theory, it could mean one common backend for just_audio with no dependency on platform-specific libraries. Of course, it would be a major update that changes a lot.

jmshrv avatar Sep 06 '21 19:09 jmshrv

I think the main stopper for this is its LGPL license https://github.com/ryanheise/just_audio/issues/103#issuecomment-694326374

nt4f04uNd avatar Sep 11 '21 23:09 nt4f04uNd

@ryanheise I was wondering if there is already work done for making this possible?

toonvanstrijp avatar Nov 12 '21 14:11 toonvanstrijp

@ToonvanStrijp thanks for the ping. I think the main issue for now is just that I would like to have multiple iOS implementations but I don't think there is yet a way in Flutter's federated plugin model to set a default iOS implementation and then allow an app to choose an override implementation, specifically when the implementation uses method channels (although the Flutter team will eventually add this).

I think it would still be possible to start development on this but just delay merging it until that Flutter issue is sorted out. However, I am continually distracted by other issues, and bug fixes are always taken as higher priority. Hopefully I (or someone) can get a foundation started so that it is easier for others to start contributing.

Finally, when I first started experimenting with AudioKit, it was with version 4.x. But now that AudioKit 5.x is out and recommended, I should probably scrap what I had started. I remember at the time 5.x was actually just released but they didn't have it on cocoapods yet (because the developer wasn't a fan of cocoapods). Fortunately 5.x is now on cocoapods.

ryanheise avatar Nov 12 '21 16:11 ryanheise

@ryanheise I started working on rewriting this library to Swift. Let me know if you're interested in merging this. Because I think if we would move to Swift a lot more developers can collaborate. (Since Swift is more the "standard" nowadays).

I'll also start looking at AudioKit 5.0, but I'm not an iOS developer, so any tips on how to structure things on the iOS side are welcome!

toonvanstrijp avatar Nov 12 '21 17:11 toonvanstrijp

@ryanheise one more question regarding AudioKit. Right now we use AVQueuePlayer, which handles all downloading and buffering built-in, right? If we're switching to AudioKit or AVAudioEngine, do we need to take care of this ourselves?

toonvanstrijp avatar Nov 12 '21 17:11 toonvanstrijp

From a total outsider's perspective, wouldn't it be easier to use a package that handles downloading/buffering for us?

Also, one issue with most iOS libraries is that they use Apple's decoding stuff, which doesn't support OPUS/Vorbis. For my use case, it's kind of annoying. I was vaguely looking into making a gstreamer backend for all platforms, but it could never be the main implementation because it's LGPL. VLCKit also won't work for the same reason.

jmshrv avatar Nov 12 '21 17:11 jmshrv

Also, one issue with most iOS libraries is that they use Apple's decoding stuff, which doesn't support OPUS/Vorbis.

I just glanced over the listed options in the Google Doc and it seems that https://github.com/sbooth/SFBAudioEngine is the only one that supports OPUS. Vorbis is not supported by any of them, though, which is probably ok, given that it's a predecessor of OPUS.

nt4f04uNd avatar Nov 12 '21 19:11 nt4f04uNd

Vorbis is not supported by any of them, though, which is probably ok, given that it's a predecessor of OPUS.

Actually, it supports Vorbis as well

nt4f04uNd avatar Nov 12 '21 19:11 nt4f04uNd

That library looks great, although I won't really be able to contribute to this as I have no experience with native iOS :(

jmshrv avatar Nov 12 '21 19:11 jmshrv

@ryanheise I started working on rewriting this library to Swift. Let me know if you're interested in merging this.

I'd definitely go with Swift for the AudioKit-based implementation since AudioKit itself is written in Swift.

Rewriting the current AVQueuePlayer implementation in Swift is something I'm a bit more hesitant to do right now, since this is the principal iOS implementation and such a large-scale rewrite is likely to introduce stability issues. Any rewriting of it should probably be planned and discussed in order to avoid that happening. I think we can also delay it until at least a while after the AudioKit-based implementation starts becoming usable, because my hope is that that implementation could eventually replace the AVQueuePlayer implementation (meaning the effort in rewriting it would be wasted).

I'll also start looking at AudioKit 5.0, but I'm not an iOS developer, so any tips on how to structure things on the iOS side are welcome!

I've done a quick experiment with AudioKit by submitting a PR to the sound_generator so you might get some ideas by looking at it. Just a couple of notes to keep in mind:

  1. Since we don't yet have an easy way of overriding the default iOS implementation, my approach would be to first do a git mv to rename the old ios/macos directories, then recreate them with flutter create for the desired platforms and languages. This will also set up the necessary bridge between Swift and Objective C. Eventually, though, I'd like this new AudioKit-based implementation to be in its own directory as a federated plugin implementation.
  2. We will also need a reasonable way to share code between the macOS and iOS implementations. I haven't looked into how to do that yet for Swift. The way I did it for the Objective C implementation is via symlinks, but there are Swift approaches that you can find by looking at other Flutter plugins written in Swift. It's unfortunate, but the official 1st-party Flutter plugins don't bother to reuse code; they instead have the iOS implementation in Objective C and the macOS implementation reimplemented in Swift.

Also, one issue with most iOS libraries is that they use Apple's decoding stuff, which doesn't support OPUS/Vorbis.

I just glanced over the listed options in the Google Doc and it seems that https://github.com/sbooth/SFBAudioEngine is the only one that supports OPUS. Vorbis is not supported by any of them, though, which is probably ok, given that it's a predecessor of OPUS.

Just a general comment here, but keeping in line with the vision of supporting multiple federated implementations of the just_audio platform interface, it is no problem if anyone wants to write an SFBAudioEngine-based implementation (which, although MIT, will still potentially involve LGPL if you use those parts that have that license) or GStreamer, etc.

I think when it comes to iOS, different people may end up needing these choices. For example, those building apps where audio processing is important (pitch shifting, time stretching, etc.) may want the AudioKit implementation, while those needing certain other formats might use a GStreamer or VLC-based implementation.

@ryanheise one more question regarding AudioKit. Right now we use AVQueuePlayer, which handles all downloading and buffering built-in, right? If we're switching to AudioKit or AVAudioEngine, do we need to take care of this ourselves?

I think that's an issue with a lot of these alternatives to AVQueuePlayer, yes we will have to manage a lot more ourselves. But at the same time, I'm running into limitations of AVQueuePlayer precisely because it manages buffering in a way I don't like, and so on the other hand there is a benefit to managing things ourselves.

ryanheise avatar Nov 13 '21 04:11 ryanheise

@ryanheise I've done a small setup as you've explained. Could you check in on this and let me know if this is the correct setup? https://github.com/wavy-assistant/just_audio/tree/feature/new_ios_implementation

toonvanstrijp avatar Nov 13 '21 14:11 toonvanstrijp

Hi @ToonvanStrijp this seems like a reasonable start to me. Thanks for taking the initiative!

One thing strange in GitHub's diff is this:

just_audio/macos/Classes/JustAudioPlugin.m → just_audio/ios_old/Classes/JustAudioPlugin.m 

I wonder if git mv really did something strange or whether that's just GitHub having problems trying to display it.

I notice the macOS podspec lists an older version of macOS than previously.

But I think any niggling issues will likely show up once implementation starts.

ryanheise avatar Nov 13 '21 15:11 ryanheise

@ryanheise I think it's an issue with Github displaying the diff.

One question before I get started. Would it be a good approach to keep the class structures and files like we have right now with the objective-c code? We're using ConcatenatingAudioSource etc. Are those still usable for ios when using AudioKit and what are your thoughts on approaching this?

toonvanstrijp avatar Nov 13 '21 16:11 toonvanstrijp

If you implement according to the just_audio platform interface, those structures will naturally come out in your design.

ryanheise avatar Nov 14 '21 01:11 ryanheise

@ryanheise I have a few more questions regarding the new implementation. I'm now working on the load function, but I'm not sure how the buffering part works. Does buffering also apply to local files (file:///)? And do you have some sample code on how to do this buffering with AudioKit?

If you want to take a look at the current code: https://github.com/wavy-assistant/just_audio/tree/feature/new_ios_implementation (feedback is welcome and appreciated)

toonvanstrijp avatar Nov 15 '21 14:11 toonvanstrijp

Hi all, @SimoneBressan has just shared some significant work in PR #658 with an AVAudioEngine-based implementation in Swift, if anyone would like to check it out. I will merge it into a new dev branch after I sort out a way for this Swift implementation to co-exist with the current Objective C implementation, so that people can choose one over the other based on stability or feature set. In particular, #658 may not implement every feature, but it does have the equalizer, which is one of the nice things that are easier to do with AVAudioEngine, or indeed AudioKit, which is something else to explore.
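As a taste of why the equalizer is easier once the graph is AVAudioEngine-based, here is a minimal sketch using AVAudioUnitEQ (the band layout and values are illustrative, not what #658 does):

```swift
import AVFoundation

// Sketch: a three-band equalizer inserted into an engine graph.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 3)

// Hypothetical band centres: low, mid, high.
let frequencies: [Float] = [60, 1000, 8000]
for (band, frequency) in zip(eq.bands, frequencies) {
    band.filterType = .parametric
    band.frequency = frequency
    band.bandwidth = 1.0   // octaves
    band.gain = 0.0        // dB; adjustable at runtime from Dart
    band.bypass = false
}

engine.attach(player)
engine.attach(eq)
engine.connect(player, to: eq, format: nil)
engine.connect(eq, to: engine.mainMixerNode, format: nil)
```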

ryanheise avatar Feb 12 '22 07:02 ryanheise