retina
Add ability to parse Annex B stream in FU-A
Summary
Add ability to break apart an Annex B stream sent in a FU-A. Fixes #68
Details
My V380 cam would send the following FU-A after establishing the RTSP connection. The FU-A did not conform to the spec.
FU-A (start) => [ sps header - sps - boundary - pps header - pps - boundary - idr header - idr ] # Annex B stream
FU-A (...) => [ sps header - idr ]
FU-A (end) => [ sps header - idr ]
Notice how all the fragments have the same FU header (as they should), but the start fragment contains an Annex B stream, so the last NAL picked out of that packet is an IDR. That means the last NAL saved from the first packet is an IDR, yet the next fragment carries the header for an SPS but data for an IDR, which is wrong.
It appears that the camera only does this for FU-As that carry SPS & PPS. FU-As and single NAL units for other NAL types conform to the spec.
I have only modified this logic for the FU-A flow. We could, however, use start_rtp_nal, append_rtp_nal, and end_rtp_nal to handle all NAL types.
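To illustrate the "break apart" step, here is a minimal sketch (not retina's actual implementation; `split_annex_b` is a hypothetical name) of splitting a byte buffer like the start fragment above on Annex B start codes, where a start code is `00 00 01`, optionally preceded by extra zero bytes:

```rust
/// Hypothetical sketch: split an Annex B byte stream into NAL units by
/// scanning for 0x000001 start codes (0x00000001 also counts, since the
/// extra leading zero is absorbed as padding before the start code).
fn split_annex_b(data: &[u8]) -> Vec<&[u8]> {
    let mut nals = Vec::new();
    let mut nal_start: Option<usize> = None;
    let mut i = 0;
    while i + 2 < data.len() {
        if data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1 {
            if let Some(s) = nal_start {
                // Trim trailing zeros; they belong to the next (4-byte)
                // start code, not to this NAL's payload.
                let mut end = i;
                while end > s && data[end - 1] == 0 {
                    end -= 1;
                }
                nals.push(&data[s..end]);
            }
            i += 3;
            nal_start = Some(i);
        } else {
            i += 1;
        }
    }
    if let Some(s) = nal_start {
        nals.push(&data[s..]);
    }
    nals
}

fn main() {
    // Mirrors the start fragment above: SPS, PPS, IDR with 0x67/0x68/0x65
    // NAL headers, separated by 3- and 4-byte start codes.
    let stream = [
        0, 0, 0, 1, 0x67, 0xAA, // SPS
        0, 0, 1, 0x68, 0xBB,    // PPS
        0, 0, 0, 1, 0x65, 0xCC, // IDR
    ];
    let nals = split_annex_b(&stream);
    assert_eq!(nals.len(), 3);
    assert_eq!(nals[0], &[0x67, 0xAA][..]);
    assert_eq!(nals[2][0], 0x65);
    println!("split into {} NALs", nals.len());
}
```

The IDR being the last NAL returned here is exactly why the continuation fragments' data must be appended to it rather than to a NAL matching the FU header's SPS type.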
Camera details
Name: V380 (It's a generic V380 outdoor camera)
Firmware: HwV380E31_WF8_PTZ_WIFI_20201218 (I had asked them for a firmware update file to enable RTSP support)
I added a bunch of comments, but this is a great first pass! Thanks for digging in; it's really nice to get contributions like this.
Thanks for taking the time out to review. I'll take a look at the comments and update the PR.
Should NalParser ignore continuous zeros? Currently benches/client.rs is failing because it creates a long run of 0x00 bytes, which break_apart_nals fails to parse because it expects three consecutive 0x00 bytes to be a boundary rather than part of a NAL.
benches/client.rs is sending a FU-A (start) with all bytes being 0x00. Isn't this test data incorrect?
Apologies—I've been quite bogged down. Will look through your comments and updates tomorrow.
Wishing you good health. Please take your time to recover.
Closing in favor of #100