Ant-Media-Server
While using Broadcast Extension it is not possible to hear the audio of the other participant.
We are using the Broadcast Extension example with a P2P connection. We want to share the screen of an iPhone with another participant while maintaining an audio connection. The remote participant receives both video and audio without issue, but the iOS user does not seem to play the audio of the other participant. This behaviour only happens with the broadcast extension; regular (foreground) screen capturing does not suffer from the same issue.
Client mode: `.join` (for P2P)
Hi @Mohit-3196,
Thank you for the issue. This is expected behavior.
I think I can provide a quick solution for this. Let's schedule the issue.
Yes, let me share the solution.
The broadcast extension captures the system audio and/or the mic audio. Here are solutions for the different scenarios.

- If one would like to send only the mic audio, not the system audio:

  1. Set `showsMicrophoneButton` to `true` in `WelcomeVideoController`:
     `self.screenRecord.showsMicrophoneButton = true;`
  2. Open `SampleHandler.swift` and go to `processSampleBuffer`.
  3. Comment out the line that sends the system audio under the `RPSampleBufferType.audioApp` case.
  4. Enable the line that sends the mic audio under the `RPSampleBufferType.audioMic` case.

  As a result, it should look like the following:

  ```swift
  override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
      switch sampleBufferType {
      case RPSampleBufferType.video:
          // Handle video sample buffer
          //NSLog("processSamplebuffer video");
          if videoEnabled {
              self.client.deliverExternalVideo(sampleBuffer: sampleBuffer);
          }
          break
      case RPSampleBufferType.audioApp:
          // Handle audio sample buffer for app (system) audio
          //NSLog("processSamplebuffer audio");
          // if audioEnabled {
          //     self.client.deliverExternalAudio(sampleBuffer: sampleBuffer);
          // }
          break
      case RPSampleBufferType.audioMic:
          // Handle audio sample buffer for mic audio
          //NSLog("processSamplebuffer audio mic");
          if audioEnabled {
              self.client.deliverExternalAudio(sampleBuffer: sampleBuffer);
          }
          break
      @unknown default:
          // Handle other sample buffer types
          fatalError("Unknown type of sample buffer")
      }
  }
  ```

  Please pay attention: don't forget to tap the Mic button to enable it when starting the screen recording.
- If one would like to send both the system audio and the mic audio, we have a workaround: the user can send an additional, audio-only stream to the server. To do that, follow these instructions:

  1. Open `SampleHandler.swift` in the iOS SDK and create a new field as below:

     ```swift
     let micAudioClient: AntMediaClient = AntMediaClient.init()
     ```

  2. Go to the end of `broadcastStarted` and add the following lines:

     ```swift
     let micAudioStreamId = "micAudio"; // give any stream id you want to have

     self.micAudioClient.delegate = self
     self.micAudioClient.setDebug(true)
     self.micAudioClient.setOptions(url: url as! String, streamId: micAudioStreamId, token: token as? String ?? "", mode: AntMediaClientMode.publish, enableDataChannel: true, captureScreenEnabled: true);

     // disable video
     self.micAudioClient.setVideoEnable(enable: false);
     self.micAudioClient.setExternalVideoCapture(externalVideoCapture: false);

     self.micAudioClient.setExternalAudio(externalAudioEnabled: true)
     self.micAudioClient.initPeerConnection();
     self.micAudioClient.start();
     ```

  3. Go to `processSampleBuffer` and edit the `RPSampleBufferType.audioMic` case so that it looks like the following:

     ```swift
     ...
     case RPSampleBufferType.audioMic:
         // Handle audio sample buffer for mic audio
         //NSLog("processSamplebuffer audio mic");
         if audioEnabled {
             self.micAudioClient.deliverExternalAudio(sampleBuffer: sampleBuffer);
         }
         break
     ...
     ```

  4. Go to the `broadcastFinished` method and stop the `micAudioClient` as shown below:

     ```swift
     override func broadcastFinished() {
         self.client.stop();
         self.micAudioClient.stop();
     }
     ```
I've tested these steps and they worked for me. I'm attaching the edited SampleHandler.swift for your convenience. I hope it helps, and let me know if I can help further.
Thanks for the suggested solutions. This, however, doesn't seem to solve our problem of receiving/playing audio when using the broadcast extension.
This is our test setup: a peer-to-peer connection between a web client <--> iOS client. The audio issue only occurs in the direction of receiving audio on the iOS client (web --> iOS). The other direction works as expected: audio/mic from the iOS client is received on the web client. And of course web <--> web also works.
A comment on the suggested third solution (by the way, should there be a second solution?): this workaround won't work in our peer-to-peer scenario, because the Ant Media Server + SDKs won't allow more than 2 clients.
Hi @doggomir, I see.
> The audio issue only occurs in the direction of receiving audio on the iOS-client (web --> iOS).
What about mixing the audio on the web side and sending the mixed audio? There is support for mixing audio on the web side. For instance, desktop audio + mic audio are mixed in this line: https://github.com/ant-media/StreamApp/blob/master/src/main/webapp/js/media_manager.js#L373
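To sketch the idea, here is a minimal Web Audio mixing helper. Note that `mixAudioStreams` is a hypothetical name for illustration, not part of the Ant Media SDK; the underlying `createMediaStreamDestination`/`createMediaStreamSource` calls are the standard Web Audio API that media_manager.js builds on.

```javascript
// Hypothetical helper (not from the Ant Media SDK): mix the audio tracks of
// several MediaStreams (e.g. desktop audio + mic) into a single MediaStream
// using the Web Audio API.
function mixAudioStreams(audioContext, streams) {
  // Destination node whose .stream carries the mixed audio.
  const destination = audioContext.createMediaStreamDestination();
  for (const stream of streams) {
    // Skip streams without audio tracks (e.g. a video-only screen share).
    if (stream.getAudioTracks().length > 0) {
      audioContext.createMediaStreamSource(stream).connect(destination);
    }
  }
  return destination.stream;
}

// Usage (in the browser): publish the mixed stream's audio track
// instead of the raw mic track.
// const mixed = mixAudioStreams(new AudioContext(), [desktopStream, micStream]);
```

With this approach the iOS side only has to play back one incoming audio track, which sidesteps the extension's receive-side limitation.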
> A comment to the suggested 3. solution (by the way, should there be a 2. solution?) This workaround won't work in our scenario of a peer-2-peer connections, because the AntMedia Server + SDKs won't allow more than 2 clients
Yes, you're right. As you already know, the quick workaround for this scenario is using AMS to relay the audio/video. Sorry about that.
Please let me know if we can help you with anything.
Regards, A. Oguz