Feature Request: High Resolution Snapshots

Open michaelmyc opened this issue 1 year ago • 5 comments

I see that the current snapshot takes the image from the video buffer and saves it. This is efficient and doesn't require adjusting the capture configuration while camera-streamer is running. This is great if you want a consistently smooth video stream and don't really care that the snapshot is the same low resolution as the stream. However, I'm specifically looking at the use case of making time-lapses with camera-streamer, where the output image (snapshot) resolution matters.

Usually, we don't need higher-quality, guaranteed-smooth streams, but we do want time-lapse images that are as high-quality as possible. This means hiccups (frame drops) in the stream while the time-lapse image is taken are acceptable. So I think a dual-buffer mode could be helpful: one buffer handles the lower-resolution video streaming while the other provides higher-resolution image capture.

I'm proposing the flow could be:

  1. The "/high-res-snapshot" endpoint is triggered
  2. The low-res buffer is dropped
  3. The high-res buffer is loaded
  4. An image is taken and returned to the endpoint
  5. The high-res buffer is dropped
  6. Streaming continues with the low-res buffer

A few potential drawbacks to this approach would be:

  1. Hiccups in the low-res buffer when the high-res buffer is active
  2. Snapshots would have a short delay
  3. Added code complexity of juggling between 2 buffers
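For context, the time-lapse use case can already be driven client-side by polling the existing /snapshot endpoint. A minimal sketch, assuming camera-streamer's default HTTP port 8080 (host, port, and interval are assumptions; the proposed /high-res-snapshot endpoint would be polled the same way):

```shell
#!/bin/sh
# Poll camera-streamer's /snapshot endpoint every 30 seconds,
# saving timestamped frames for a time-lapse.
while true; do
  curl -fsS -o "frame-$(date +%s).jpg" "http://127.0.0.1:8080/snapshot"
  sleep 30
done
```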

Fairly new to camera-streamer, so I'm still trying to figure things out. Not sure how feasible this idea is, or whether something already exists to achieve high-resolution snapshots while maintaining a lower-resolution video feed.

michaelmyc avatar Apr 09 '23 13:04 michaelmyc

The app can already output a higher resolution for the snapshot than for the stream. It is limited to 1920x1920. You can specify a lower resolution for the stream or video feed.
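For example, something along these lines. This is a sketch only: the option names follow the project README, but they may differ between versions, and the device path is an assumption for your setup.

```shell
# Capture from the sensor at 1920x1080 and expose a full-height
# /snapshot, while /video and /stream are served at lower resolutions.
camera-streamer \
  --camera-path=/dev/video0 \
  --camera-width=1920 --camera-height=1080 \
  --camera-snapshot.height=1080 \
  --camera-video.height=720 \
  --camera-stream.height=480
```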

ayufan avatar Apr 09 '23 13:04 ayufan

Ah, I see. Why did we limit it to 1920x1920? There are quite a few cameras that can output images much larger than that.

michaelmyc avatar Apr 09 '23 13:04 michaelmyc

They do. But this is a limitation of the Raspberry Pi hardware JPEG encoder.

ayufan avatar Apr 09 '23 13:04 ayufan

Hmm. I can use libcamera-jpeg to get images larger than 1920x1920, though.
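For reference, a capture along these lines. The resolution shown is the Raspberry Pi HQ camera's (IMX477) native resolution, used here only as an example; substitute your own sensor's resolution.

```shell
# Capture a full-resolution still; the JPEG is encoded in software
# by libcamera-jpeg rather than by the hardware encoder.
libcamera-jpeg -o snapshot.jpg --width 4056 --height 3040
```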

michaelmyc avatar Apr 09 '23 13:04 michaelmyc

Yes, but then it is encoded in software, which means high CPU usage. And the quality difference is not that big.

It's also worth noting that at resolutions up to 1920 pixels you can fully re-encode into H.264 in hardware.

ayufan avatar Apr 09 '23 13:04 ayufan