davidliu
Sharing the code is still meaningful, since it lets us sanity-check it for mistakes and, more importantly, gives us a repro to evaluate. Haven't seen this crash...
@filbabic Hmm, the JavaAudioDeviceModule handles everything by itself. No action is needed at all to start playback; audio is automatically played through the current audio output whenever a track is...
Again, no. There's nothing to do between receiving tracks and playing audio. Audio on Android is basically self-contained and should automatically play out if all the underlying SDP negotiation with offer...
Stumbled back onto this issue a couple of years later:

```
sdp_offer_answer.cc: (line 2359): CreateAnswer: offer_to_receive_audio is not supported with Unified Plan semantics. Use the RtpTransceiver API instead.
```

This was...
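For reference, the migration that warning asks for looks roughly like this. This is a sketch against the standard WebRTC API shape (which flutter_webrtc mirrors); `PeerConnectionLike` and `createRecvOnlyOffer` are hypothetical names, and the structural type stands in for a real `RTCPeerConnection` so the snippet is self-contained:

```typescript
// Minimal structural stand-in for the parts of RTCPeerConnection used here.
type PeerConnectionLike = {
  addTransceiver(kind: string, init: { direction: string }): void;
  createOffer(): Promise<{ type: string; sdp?: string }>;
};

// Unified Plan: declare "I want to receive audio" with a transceiver,
// instead of the removed { offerToReceiveAudio: true } option.
async function createRecvOnlyOffer(pc: PeerConnectionLike) {
  pc.addTransceiver("audio", { direction: "recvonly" });
  return pc.createOffer();
}
```

The key difference is that receive intent is now attached to a concrete transceiver before negotiation, rather than passed as a flag to `createOffer`/`createAnswer`.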
@cloudwebrtc I think we may want to expose groupId in `MediaDevice`. iOS exposes the portType, which I think should be enough to handle this.
@cgarbacea as an intermediate solution to unblock, you can cross-reference against the underlying `enumerateDevices()` function:

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart' as rtc;

var mediaDeviceInfos = await rtc.navigator.mediaDevices.enumerateDevices();
```

Each MediaDeviceInfo object...
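The cross-referencing idea itself is plain list matching. A sketch in TypeScript, with hypothetical shapes: `MediaDevice` stands in for a device record that lacks `groupId`, `DeviceInfo` mirrors the web-style `MediaDeviceInfo` (which does carry `groupId`), and `withGroupIds` is an illustrative helper, not a library API:

```typescript
// Hypothetical shapes for illustration only.
type MediaDevice = { deviceId: string; label: string };
type DeviceInfo = { deviceId: string; kind: string; label: string; groupId: string };

// Cross-reference the two lists by deviceId to attach the missing groupId.
function withGroupIds(devices: MediaDevice[], infos: DeviceInfo[]) {
  const byId = new Map(infos.map(i => [i.deviceId, i.groupId] as const));
  return devices.map(d => ({ ...d, groupId: byId.get(d.deviceId) ?? "" }));
}
```

Matching on `deviceId` (rather than `label`) avoids collisions when two devices share a display name.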
Is this a separate issue? [The audio input/output devices should properly set groupId.](https://github.com/flutter-webrtc/flutter-webrtc/blob/8a880156e0a02e9b4cfd1b39157ba94a6b36eaf1/common/darwin/Classes/FlutterRTCMediaStream.m#L630) Tested it, and all the audio-related devices gave their port type.
Fixed by #67.
Fixed by #67.
Can you turn on logging in your client and post the logs?