webrtc-stats
RTCCodecStats.clockRate - media sampling rate or the codec clock rate?
The spec for RTCCodecStats.clockRate says it is the "media sampling rate". However, the value that populates it is defined in RTCRtpCodec.clockRate as "the codec clock rate expressed in Hertz", which MDN documents as "the rate at which the codec's RTP timestamp advances".
I'm largely ignorant of this area, but from what I can find, these appear to be different things. So I am wondering whether the RTCCodecStats.clockRate definition is correct and intended.
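For context, here is a sketch (not from the spec) of why the two rates can differ. G.722 is the classic case: RFC 3551 fixes its RTP clock rate at 8000 Hz even though the codec samples audio at 16000 Hz; similarly, Opus (RFC 7587) always uses a 48000 Hz RTP clock regardless of the capture sampling rate. The helper function below is hypothetical, just to make the arithmetic concrete:

```javascript
// Hypothetical helper: how far the RTP timestamp advances for one
// packet of `ms` milliseconds at a given clock rate.
function rtpTimestampIncrement(clockRateHz, ms) {
  return clockRateHz * (ms / 1000);
}

// G.722: a 20 ms packet advances the RTP timestamp by 160 ticks
// (8000 Hz RTP clock rate), yet it carries 320 audio samples
// (16000 Hz media sampling rate) — two genuinely different numbers.
console.log(rtpTimestampIncrement(8000, 20));  // RTP timestamp ticks: 160
console.log(rtpTimestampIncrement(16000, 20)); // audio samples: 320
```

So if RTCCodecStats.clockRate is filled from RTCRtpCodec.clockRate, for G.722 it would report 8000, not the 16000 Hz "media sampling rate" the stats spec's wording suggests.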
This is for an MDN docs update: https://github.com/mdn/content/pull/32452/files#r1503650664