
Is there a way to build this as an iOS/Android module?

Open • haozes opened this issue 3 years ago • 15 comments

Hi, the demo of this code is really great. I am not familiar with building the C code into an iOS/Android SDK, so an example showing how to build for mobile platforms would be very helpful.

Thank you very much.

haozes avatar Dec 21 '20 09:12 haozes

Today I added a C interface to the library and implemented Python bindings. I think this C interface can be used for iOS development. A simple example demonstrating how to use it would be great, though.

JNI bindings are necessary for Android development - will try to implement those along the way.

Would be awesome if someone is willing to help with this - my iOS/Android experience is very limited.

ggerganov avatar Jan 17 '21 20:01 ggerganov

How is the Waver Android app using your C code? I assumed that you had already done the JNI to build Waver.

commonsguy avatar Jan 17 '21 20:01 commonsguy

In Waver I used https://github.com/pthom/hello_imgui for the boilerplate and only had to plug in my C++ code. It uses SDL for audio capture/playback, and the C++ SDL functions automatically do what they are supposed to.

For ggwave I want to make a generic SDK that is not coupled to SDL. Essentially, it would provide the same interface as the new C functions for iOS and Android. The developer can then plug it into any backend they would like - not only SDL.

ggerganov avatar Jan 17 '21 21:01 ggerganov

I can take a shot at the Android side.

I recommend that I create a separate project as a "spike solution". We can then determine which parts of it you like and how you would want to migrate those pieces into this GitHub repo (or some other repo of yours).

Let me know if that is OK or if you would prefer some other approach — thanks!

commonsguy avatar Jan 23 '21 19:01 commonsguy

Sounds good. Ideally I would like to have a simple test for each platform/language which simply encodes and decodes some data without using any audio engine. I have currently made such tests for C, C++ and Python in the tests folder. Then, a simple example where actual audio is captured/played would go in the examples folder. The example can use any audio engine available.
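For reference, this is roughly what such a no-audio round-trip test could look like with the C API (the exact signatures, the protocol constant and the volume value below are assumptions - double-check them against ggwave.h):

```c
// Sketch of a no-audio encode/decode round-trip test using the ggwave C API.
// The signatures, protocol constant and volume value are assumptions here -
// include/ggwave/ggwave.h is the authoritative reference.
#include "ggwave/ggwave.h"

#include <assert.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    // Assumes the default parameters use the same sample format for input and
    // output, so the generated waveform can be fed straight back to the decoder.
    ggwave_Parameters params = ggwave_getDefaultParameters();
    ggwave_Instance instance = ggwave_init(params);

    const char * payload = "hello";
    const int payloadSize = (int) strlen(payload);

    // First call with query != 0: only ask how big outputBuffer needs to be.
    const int waveformSize = ggwave_encode(
        instance, payload, payloadSize,
        GGWAVE_TX_PROTOCOL_AUDIBLE_FAST /* assumed constant name */,
        25 /* volume */, NULL, 1 /* query */);

    // Second call with query == 0: actually generate the waveform.
    char * waveform = malloc(waveformSize);
    ggwave_encode(
        instance, payload, payloadSize,
        GGWAVE_TX_PROTOCOL_AUDIBLE_FAST,
        25 /* volume */, waveform, 0 /* query */);

    // Decode the generated waveform directly - no audio engine involved.
    // A real test might need to feed the decoder in smaller chunks
    // (e.g. samplesPerFrame at a time).
    char decoded[256] = { 0 };
    const int decodedSize = ggwave_decode(instance, waveform, waveformSize, decoded);

    assert(decodedSize == payloadSize);
    assert(memcmp(decoded, payload, payloadSize) == 0);

    free(waveform);
    ggwave_free(instance);
    return 0;
}
```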

When you are ready with the Android project we can look into incorporating it in this repo.

Let me know if you think the C API can be improved in any way. I am planning to add a few more functions to it. My next goal is to implement JavaScript bindings for it using Emscripten.

ggerganov avatar Jan 23 '21 20:01 ggerganov

Let me know if you think the C API can be improved in any way

To be completely honest... it could use documentation. For example:

  • For ggwave_decode():
    • What is the data format for dataBuffer?
    • How big is outputBuffer supposed to be?
  • For ggwave_encode():
    • What is the range of valid values for volume?
    • Is the idea that we should call this twice, once with query set to 1, allocate outputBuffer based upon that, then call it again with query set to 0?
    • What is the data format for outputBuffer? (I'm guessing it is the same as the format for dataBuffer in ggwave_decode())

My apologies if some of this should be obvious. I am very good with Android, but my serious C/C++ work is 20 years in my past. I can still do some light JNI work, but I am not in a position to determine this sort of information solely from reading your C++ source code.

commonsguy avatar Jan 24 '21 22:01 commonsguy

Absolutely agree - I need to work on that. I added some more comments to ggwave.h:

https://github.com/ggerganov/ggwave/blob/master/include/ggwave/ggwave.h#L21-L195

I hope these can answer your questions. While writing this, I realized I have named the arguments of ggwave_encode() and ggwave_decode() very poorly:

  • In ggwave_encode() the dataBuffer is the data that you would like to encode into audio (for example, some text like "hello"), while the outputBuffer is the buffer that contains the generated audio waveform.
  • At the same time, in ggwave_decode() the dataBuffer argument is the input audio waveform that you captured and would like to decode, while the outputBuffer is the decoded data (i.e. the "hello" message).

So the naming of the two buffers is reversed between the two functions, which is kind of confusing. I added "todo" comments to the header file noting that this will be changed. For now, I will leave it like this and fix it at a later stage.
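In other words, with the current naming the two calls line up roughly like this (placeholder variable names, not code from the repo):

```c
// ggwave_encode(): dataBuffer is the payload, outputBuffer receives the generated audio
ggwave_encode(instance, payloadText, payloadSize, protocolId, volume, waveformOut, 0);

// ggwave_decode(): dataBuffer is the captured audio, outputBuffer receives the decoded payload
ggwave_decode(instance, capturedWaveform, capturedSize, payloadOut);
```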

What is the data format for outputBuffer? (I'm guessing it is the same as the format for dataBuffer in ggwave_decode())

Correct, but not always. Usually, the user will be capturing and playing samples in the same data format (for example, 32-bit floats). In that case - yes, they are the same format. However, one could for example capture the audio in 32-bit float format, but decide to play it in 16-bit integer format. In that case, the two formats are different.

To summarize:

  • The format of dataBuffer in ggwave_decode() corresponds to the sampleFormatInp parameter used during ggwave_init()
  • The format of outputBuffer in ggwave_encode() corresponds to the sampleFormatOut parameter used during ggwave_init()
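For example, a sketch of initializing an instance that captures 32-bit float audio but generates 16-bit integer audio (the GGWAVE_SAMPLE_FORMAT_* constant names are assumed from ggwave.h, and the make_instance helper is just for illustration):

```c
#include "ggwave/ggwave.h"

// Capture in 32-bit float, generate in 16-bit int.
ggwave_Instance make_instance(void) {
    ggwave_Parameters params = ggwave_getDefaultParameters();
    params.sampleFormatInp = GGWAVE_SAMPLE_FORMAT_F32; // format of dataBuffer in ggwave_decode()
    params.sampleFormatOut = GGWAVE_SAMPLE_FORMAT_I16; // format of outputBuffer in ggwave_encode()
    return ggwave_init(params);
}
```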

ggerganov avatar Jan 25 '21 19:01 ggerganov

That's a huge help — thanks!

In terms of the content format, though... is this raw PCM? Some of your code is centered around WAV files, which suggests that the content format is PCM, but I may be misinterpreting what I'm seeing in that code.

commonsguy avatar Jan 25 '21 23:01 commonsguy

Yes - we generate and process only raw PCM. Whenever I use the word waveform, I actually mean PCM :)

ggerganov avatar Jan 26 '21 05:01 ggerganov

OK, after having poked at this for a while, I have concluded that I lack enough JNI ability to do this well. :frowning_face: I am an Android generalist, and both JNI and media are at the edges of my areas of expertise. Combining the two is not going well.

I will see if I can find some other way to get you the help that you need.

commonsguy avatar Feb 01 '21 21:02 commonsguy

No worries, I really appreciate the effort! Thanks!

ggerganov avatar Feb 02 '21 06:02 ggerganov

I'm back! :grin:

I still needed a proof of concept for using GGWave in an Android app. So, I put together this demo that uses your JavaScript implementation in an Android WebView and exposes an API for ordinary Android activities to send/receive data. It works surprisingly well, though this approach is far from the best way to add GGWave to Android. It is just the easiest solution for me in the short term. I have confirmed that it interoperates with Waver, as expected.

If you have any questions about that project, or if there is anything I could add there that would help you, let me know! I'll be making minor tweaks to it along the way.

commonsguy avatar Feb 27 '21 21:02 commonsguy

Awesome!

My initial attempts to make a mobile demo were to use the JS/Wasm port in a Cordova project, but I couldn't figure out all the details.

This looks really well done. I will take a more detailed look in the coming days and try to run it myself.

ggerganov avatar Feb 27 '21 21:02 ggerganov

I have finally implemented an Objective-C iOS app using ggwave:

https://github.com/ggerganov/ggwave-objc

ggerganov avatar Apr 24 '21 11:04 ggerganov

Minimal Android example in Java/C++ is now also available:

https://github.com/ggerganov/ggwave-java

ggerganov avatar Apr 27 '21 18:04 ggerganov