amazon-chime-sdk-js
Stereo mic feeds and audio transforms
What are you trying to do?
I'm trying to figure out how to enable stereo audio feeds when using a combination of this Chime SDK and the React SDK. I've set up a StereoPannerNode and have confirmed that it accepts either a mono or stereo media stream (from a device or elsewhere) and outputs two channels. Ideally, I'd love to process microphone inputs client-side so that attendees are spread across the stereo field (i.e. left to right).
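For reference, here is roughly what I have so far. It's only a sketch against the `AudioTransformDevice` / `AudioNodeSubgraph` interfaces as I understand them from the SDK's audio-processing docs; the class name and pan values are my own:

```ts
import { AudioNodeSubgraph, AudioTransformDevice, Device } from 'amazon-chime-sdk-js';

// What I've set up: wrap the chosen mic in a transform device that routes it
// through a StereoPannerNode, so each attendee's input can be placed on the
// left/right plane. The class name and pan values here are my own.
class PanningAudioTransformDevice implements AudioTransformDevice {
  private panner: StereoPannerNode | undefined;

  constructor(private inner: Device, private pan: number = 0) {}

  async intrinsicDevice(): Promise<Device> {
    // The underlying mic: a device id, constraints, or a MediaStream.
    return this.inner;
  }

  async createAudioNode(context: AudioContext): Promise<AudioNodeSubgraph> {
    // -1 is hard left, 1 is hard right, 0 is centered.
    this.panner = new StereoPannerNode(context, { pan: this.pan });
    return { start: this.panner, end: this.panner };
  }

  async mute(_muted: boolean): Promise<void> {
    // No-op here; muting is handled on the inner device.
  }

  async stop(): Promise<void> {
    this.panner?.disconnect();
    this.panner = undefined;
  }
}
```

I pass an instance of this to the device controller (chooseAudioInputDevice in 2.x, startAudioInput in 3.x), and the panner node reports two output channels, but I can't tell whether anything downstream preserves them.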
I understand the dangers of this:
- Devices may not support Web Audio
- Devices may mix down to mono
- Stereo increases the bandwidth of calls
I've scoured the available documentation for information on enabling stereo and have found that it appears to be undocumented.
How can the documentation be improved to help your use case?
It would be great to have some further documentation and examples on how to enable this feature! I understand that this likely involves a lot of fiddly configuration with audio contexts, but there is no indication of how sources are passed through Chime or where audio might be mixed down to mono.
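For what it's worth, the only client-side setting I've found that mentions stereo at all is `AudioProfile`, so my unverified guess is that selecting the stereo music profile is part of the answer; the helper name below is just mine:

```ts
import { AudioProfile, AudioVideoFacade } from 'amazon-chime-sdk-js';

// Unverified guess: select the stereo music profile before starting audio
// input so that the send side negotiates two channels. Whether the mic path
// is still mixed down to mono somewhere after this is the part I can't find
// documented.
function applyStereoProfileGuess(audioVideo: AudioVideoFacade): void {
  audioVideo.setAudioProfile(AudioProfile.fullbandMusicStereo());
}
```

Even if that guess is right, it isn't clear how it interacts with an audio transform device like the one above.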
What documentation have you looked at so far?
- https://aws.amazon.com/about-aws/whats-new/2021/12/amazon-chime-sdk-stereo-audio/
- https://aws.amazon.com/chime/chime-sdk/features/
- https://aws.github.io/amazon-chime-sdk-android/amazon-chime-sdk/com.amazonaws.services.chime.sdk.meetings.audiovideo.audio/-audio-mode/index.html
- https://docs.aws.amazon.com/pdfs/chime-sdk/latest/APIReference/chime-sdk-api.pdf#API_meeting-chime_AudioFeatures
- https://github.com/aws/amazon-chime-sdk-js/issues/806
- https://github.com/aws/amazon-chime-sdk-js/issues/632