openwebrtc-ios-sdk
Add mic muting to OpenWebRTCNativeHandler
As discussed in the mailing list: https://groups.google.com/forum/#!topic/openwebrtc/Spa9VPE58P0
It's pretty important in videoconferencing for participants to be able to mute their microphones when they aren't talking. In many cases this is necessary to avoid polluting the discussion with ambient noise. On iOS in particular, and perhaps on other platforms as well, there is no high-level function for doing this. Instead, OpenWebRTC should provide this ability in the NativeHandler.
I think we need to implement all the common audio functions of this kind: mute, speakerphone, headphones + device mic, and headset.
Currently I think one can mute received calls but not mute what one is sending. It should not be too difficult to do, though. Either use the same mechanism (insert the 'volume' GStreamer element somewhere in OwrMediaSource and expose its mute property), or use some iOS-specific API in the osxaudiosrc GStreamer element and expose a mute property on it. I think I prefer the volume element approach as it's more generic.
Then the ability to mute sources will need to be hooked up in the JavaScript bridge code to benefit hybrid apps.
@adrianbg is this something you would be able to help out with?
I'm pretty pressed for time so I think I'll try to achieve this using AppRTC first. If that doesn't work out then I'll try to add it to OpenWebRTC.
I had some issues with WebRTC.org. The experience dissuaded me of the idea that it'd be more reliable than OpenWebRTC. It also seems like a far less approachable codebase, so I'm back on OpenWebRTC.
I'll take a look at OwrMediaSource and add a volume element. I'm not the best C programmer though so we'll see how it goes.
Btw, I tried AVAudioSession.setInputGain() on iOS and it had no effect.
@adrianbg ok, looking forward to hearing how things are progressing.
@adrianbg How can I build the openwebrtc framework together with your changes and then build my iOS project against it?