UVC
Is it possible to use this stack to interact with a webcam through UVC? If so, how?
Unfortunately there is no online documentation for now. The basic UVC workflow is as follows:
- Initialize USBX with UVC
/*========================================================================*/
/*= Initialize USBX with UVC support.                                     */
/*========================================================================*/

/* Initialize USBX system. */
status = ux_system_initialize(memory_pointer, USBX_MEMORY_SIZE, usbx_cache_safe_memory, USBX_CACHE_SAFE_MEMORY_SIZE);
if (status != UX_SUCCESS)
    error_handler();

/* Initialize USBX Host Stack. */
status = ux_host_stack_initialize(NULL);
if (status != UX_SUCCESS)
    error_handler();

/* Register the video class. */
status = ux_host_stack_class_register(_ux_system_host_class_video_name, _ux_host_class_video_entry);
if (status != UX_SUCCESS)
    error_handler();

/* Register the EHCI HCD. */
status = ux_host_stack_hcd_register(_ux_system_host_hcd_ehci_name, _ux_hcd_ehci_initialize, EHCI_BASE, 0x0);
if (status != UX_SUCCESS)
    error_handler();
- Wait for a UVC device connection
/*========================================================================*/
/*= Wait until a UVC device is connected.                                 */
/*========================================================================*/

/* Find the main video container. */
status = ux_host_stack_class_get(_ux_system_host_class_video_name, &host_class);
if (status != UX_SUCCESS)
    error_handler();

/* Get the first instance of the video device. */
while (1)
{
    status = ux_host_stack_class_instance_get(host_class, 0, (void **) &inst);
    if (status == UX_SUCCESS)
        break;
    tx_thread_sleep(10);
}

/* We still need to wait for the video state to be live. */
while (inst -> ux_host_class_video_state != UX_HOST_CLASS_INSTANCE_LIVE)
{
    tx_thread_sleep(10);
}

video = inst;
- Set up parameters (TEN_FRAMES_PER_SECOND is a frame interval; see the note below)
/* Set video parameters to MJPEG at the desired resolution and frame rate. */
status = ux_host_class_video_frame_parameters_set(video,
                                                  UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG,
                                                  CAMERA_RESOLUTION_WIDTH,
                                                  CAMERA_RESOLUTION_HEIGHT,
                                                  TEN_FRAMES_PER_SECOND);
if (status != UX_SUCCESS)
    error_handler();
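TEN_FRAMES_PER_SECOND is not defined in the snippet above; as discussed later in this thread, the last argument is a frame interval in 100 ns units, so a definition along these lines is assumed:
/* Frame interval in 100 ns units: 10,000,000 / fps (10 fps here). */
#define TEN_FRAMES_PER_SECOND  1000000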
- Start streaming (the buffer definitions assumed here are shown after this block)
/*========================================================================*/
/*= Start UVC streaming.                                                  */
/*========================================================================*/

/* Start video transfer. */
status = ux_host_class_video_start(video);
if (status != UX_SUCCESS)
    error_handler();

#if HIGH_BANDWIDTH_EHCI /* The HCD must support adding a request list. */

/* Build the buffer list. */
for (i = 0; i < VIDEO_BUFFER_NB; i++)
    video_buffers[i] = video_buffer[i];

/* Issue the transfer request list to start streaming. */
status = ux_host_class_video_transfer_buffers_add(video, video_buffers, VIDEO_BUFFER_NB);
if (status != UX_SUCCESS)
    error_handler();

#elif NORMAL_BANDWIDTH_OHCI /* The HCD adds requests one by one. */

/* Add buffers to the video device for video streaming data. */
for (buffer_index = 0; buffer_index < VIDEO_BUFFER_NB; buffer_index++)
{
    status = ux_host_class_video_transfer_buffer_add(video,
                                                     video_buffer[buffer_index]);
    if (status != UX_SUCCESS)
        error_handler();
}

#endif
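The snippets above also assume buffer definitions like the ones used in the full example later in this thread, for instance:
/* Number of buffers circulating in the transfer request list. */
#define VIDEO_BUFFER_NB (UX_HOST_CLASS_VIDEO_TRANSFER_REQUEST_COUNT - 1)
/* One buffer per transfer request; 3072 bytes fits a high-speed,
   high-bandwidth isochronous payload (3 x 1024 bytes). */
UCHAR video_buffer[UX_HOST_CLASS_VIDEO_TRANSFER_REQUEST_COUNT][3072];
UCHAR *video_buffers[VIDEO_BUFFER_NB];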
- Handle frame data and reuse frame buffers, assuming a transfer-done callback that puts a semaphore (a sketch of such a callback follows this block):
/* Set the transfer callback (do this before starting the transfer). */
ux_host_class_video_transfer_callback_set(video, video_transfer_done);

/* Wait for transfers to complete and re-use the buffers. */
buffer_index = 0;
while (1)
{
    /* Suspend here until the transfer callback is called. */
    status = tx_semaphore_get(&data_received_semaphore, TX_WAIT_FOREVER);
    if (status != UX_SUCCESS)
        error_handler();

    /* Data received. The callback function needs to obtain the actual
       number of bytes received, so the application can read the correct
       amount of data from the buffer. The application can now consume the
       video data while the video device stores data into the other
       buffers. */

    /* Add the buffer back for video transfer. */
    status = ux_host_class_video_transfer_buffer_add(video,
                                                     video_buffer[buffer_index]);
    if (status != UX_SUCCESS)
        error_handler();

    /* Increment buffer_index, wrapping to zero when it reaches the
       number of buffers. */
    buffer_index = (buffer_index + 1);
    if (buffer_index >= VIDEO_BUFFER_NB)
        buffer_index = 0;
}
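The transfer-done callback assumed above can be as simple as this (the same pattern appears in the full example later in this thread; data_received_semaphore is a TX_SEMAPHORE created by the application):
/* Runs in the USB stack's completion context: just signal the app thread. */
static VOID video_transfer_done(UX_TRANSFER *transfer_request)
{
    /* transfer_request -> ux_transfer_request_actual_length holds the number
       of bytes actually received for this request. */
    tx_semaphore_put(&data_received_semaphore);
}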
Thank you for the detailed answer!
Closing this issue. Feel free to reopen if you have questions.
Are there any more complete examples for this yet? I used the above code to get to the point where I can set parameters, but my device doesn't stream.
@bSquaredOlea There is no complete example project yet. Whether streaming works actually depends on your hardware and host controller driver (HCD), since isochronous transfer is quite different from bulk and interrupt transfer. If your HCD is not ready for isochronous transfer, there is no stream.
You can try the following steps to build a video example that enumerates and starts streaming on a USB 2.0 high-speed webcam (tested with a "Microsoft LifeCam Studio(TM)"):
Get MIMXRT1060 Examples
- Download: https://github.com/azure-rtos/samples/releases/download/v6.1_rel/Azure_RTOS_6.1_MIMXRT1060_IAR_Samples_2021_11_03.zip
- Extract
- Confirm that the project sample_usbx_host_mass_storage works; we will modify it
Modifications in sample_usbx_host_mass_storage.c
- Add include file
#include "ux_host_class_video.h"
- Add global variables
/* Define the number of buffers used in this demo. */
#define VIDEO_BUFFER_NB (UX_HOST_CLASS_VIDEO_TRANSFER_REQUEST_COUNT - 1)

UX_HOST_CLASS_VIDEO *video;

#pragma location="NonCacheable"
UCHAR video_buffer[UX_HOST_CLASS_VIDEO_TRANSFER_REQUEST_COUNT][3072];

/* This semaphore is used by the callback function to signal the application
   thread that video data has been received and can be processed. */
TX_SEMAPHORE data_received_semaphore;
- Add an instance check function (before demo_thread_entry)
static UINT demo_class_video_check(void)
{
    UINT status;
    UX_HOST_CLASS *host_class;
    UX_HOST_CLASS_VIDEO *inst;

    /* Find the main video container. */
    status = ux_host_stack_class_get(_ux_system_host_class_video_name, &host_class);
    if (status != UX_SUCCESS)
        while(1); /* Error halt. */

    /* Get the first instance of the video device. */
    while (1)
    {
        status = ux_host_stack_class_instance_get(host_class, 0, (void **) &inst);
        if (status == UX_SUCCESS)
            break;
        tx_thread_sleep(10);
    }

    /* We still need to wait for the video state to be live. */
    while (inst -> ux_host_class_video_state != UX_HOST_CLASS_INSTANCE_LIVE)
    {
        tx_thread_sleep(10);
    }

    video = inst;
    return(UX_SUCCESS);
}
- Add the video transfer done callback (before demo_thread_entry)
/* Video data received callback function. */
static VOID video_transfer_done (UX_TRANSFER * transfer_request)
{
    UINT status;

    /* transfer_request -> ux_transfer_request_actual_length holds the number
       of bytes received for this request; here we just signal the thread. */
    status = tx_semaphore_put(&data_received_semaphore);
    if (status != UX_SUCCESS)
        while(1); /* Error halt. */
}
- Add class registration (in demo_thread_entry)
/* Register the video class. */
status = ux_host_stack_class_register(_ux_system_host_class_video_name, _ux_host_class_video_entry);
if (status != UX_SUCCESS)
    return;
- Replace the while loop code block in demo_thread_entry
UINT i;
UINT buffer_index;
UCHAR *video_buffers[VIDEO_BUFFER_NB];

/* Create the semaphore for signaling video data received. */
status = tx_semaphore_create(&data_received_semaphore, "payload semaphore", 0);
if (status != UX_SUCCESS)
    while(1); /* Error halt. */

/* Wait for a camera to be plugged in; video then points to a live instance. */
demo_class_video_check();

/* Set the transfer callback. */
ux_host_class_video_transfer_callback_set(video, video_transfer_done);

/* Set video parameters to MJPEG, 176x144 resolution, 30 fps
   (333333 is the frame interval in 100 ns units: 10,000,000 / 30). */
status = ux_host_class_video_frame_parameters_set(video,
                                                  UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG, 176, 144, 333333);
if (status != UX_SUCCESS)
    while(1); /* Error halt. */

/* Start video transfer. */
status = ux_host_class_video_start(video);
if (status != UX_SUCCESS)
    while(1); /* Error halt. */

/* Build the buffer list. */
for (i = 0; i < VIDEO_BUFFER_NB; i++)
    video_buffers[i] = video_buffer[i];

/* Issue the transfer request list to start streaming. */
status = ux_host_class_video_transfer_buffers_add(video, video_buffers, VIDEO_BUFFER_NB);
if (status != UX_SUCCESS)
    while(1); /* Error halt. */

buffer_index = 0;
while (1)
{
    /* Suspend here until the transfer callback is called. */
    status = tx_semaphore_get(&data_received_semaphore, TX_WAIT_FOREVER);
    if (status != UX_SUCCESS)
        while(1); /* Error halt. */

    /* Data received. The callback function needs to obtain the actual
       number of bytes received, so the application can read the correct
       amount of data from the buffer. The application can now consume the
       video data while the device stores data into the other buffers. */

    /* Add the buffer back for video transfer. */
    status = ux_host_class_video_transfer_buffer_add(video,
                                                     video_buffer[buffer_index]);
    if (status != UX_SUCCESS)
        while(1); /* Error halt. */

    /* Increment buffer_index, wrapping to zero when it reaches the
       number of buffers. */
    buffer_index = (buffer_index + 1);
    if (buffer_index >= VIDEO_BUFFER_NB)
        buffer_index = 0;
}
@bSquaredOlea: does the sample code help?
Thanks for your responses. I was able to get to a point where I started a transaction, but never got any video data from the device (I could get metadata from the device, though). However, we are moving in a different direction now.
Thanks, Ben
@bSquaredOlea Thanks for sharing the progress. For the transaction, please note that UVC transfers are based on isochronous endpoints, which work differently from control requests and bulk transfers, so only an HCD with isochronous transfer support can get video data from the device (this has been done in the EHCI HCD for the 1060). If you are working on some other chip, your HCD still needs isochronous transfer support to make things work.
Hi, we can forward the video stream through the Ethernet port to a PC. Is there a recommended PC-side application that can receive the stream from the Ethernet port and display the video? Thanks!
Not for the raw video stream.
Hi @xiaocq2001, thanks for the comment. What about MPEG format? We can output MPEG format to the PC. Can some webcam application, e.g. webcamiod, be used to display the video collected through USBX? If not, what are some of the limiting factors? Thanks!
@xianghui-renesas, I'm not sure a directly forwarded USB video stream can be recognized by webcamiod; maybe you can try. I think the video stream must be re-arranged/packaged with some web streaming protocol to allow a PC application to play it.
Hi @xiaocq2001, thanks! I had a quick try with webcamiod and found it primarily looks for a USB video streaming device and is unaware of the host video packet format. I have a specific question on the definition of TEN_FRAMES_PER_SECOND: how does it map to fps? Appreciate any comment you can provide. Thanks!

status = ux_host_class_video_frame_parameters_set(video, UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG, CAMERA_RESOLUTION_WIDTH, CAMERA_RESOLUTION_HEIGHT, TEN_FRAMES_PER_SECOND);
Yes, that's the frame rate parameter.
Hi @xiaocq2001, thanks! How is an input of 333333 converted to 30 fps, and what is the unit of this argument in the API? Thanks!

/* Set video parameters to MJPEG, 176x144 resolution, 30 fps. */
status = ux_host_class_video_frame_parameters_set(video, UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG, 176, 144, 333333);

I can see 10000000/30 = 333333. It seems the unit of this argument is a tenth of a microsecond. Could you explain?
Please refer to [Universal Serial Bus Device Class Definition for Video Devices: Frame Based Payload], where you can see that frame intervals are in units of 100 ns.
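So the conversion is simply interval = 10,000,000 / fps. A small helper macro (hypothetical, not a USBX API) makes the values used in this thread explicit:

/* UVC frame intervals are in 100 ns units (10,000,000 units per second). */
#define FPS_TO_FRAME_INTERVAL(fps)  (10000000UL / (fps))
/* FPS_TO_FRAME_INTERVAL(30) == 333333, FPS_TO_FRAME_INTERVAL(10) == 1000000 */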
Thanks @xiaocq2001 for the reference information. We are trying to stream the video to a PC through a UDP port. The PC application we are trying to use is VLC: https://docs.videolan.me/vlc-user/3.0/en/advanced/streaming/stream_over_udp.html Do you have any experience using the host video class with VLC? One of the video formats we identified using VLC is UX_HOST_CLASS_VIDEO_VS_FORMAT_H264; how do we set up the bandwidth? Also, a general question: how is the color channel format defined? We do not see any information in the stack for this.
Unfortunately, there is no H.264 format demo for now; maybe you can trace an existing H.264 camera for reference. There is also an H.264 payload spec on usb.org (Universal Serial Bus Device Class Definition for Video Devices: H.264 Payload).
In general, USB bandwidth selection is done by switching between the alternate settings of the video streaming interface.
Thanks @xiaocq2001, could you explain how the color channel encoding is defined in the USBX host video stack? If we collect images using an uncompressed format, what is the format of the data in the video buffer? For example, with the configuration below, how are the color coding and buffer data format defined?

ux_host_class_video_frame_parameters_set(video, UX_HOST_CLASS_VIDEO_VS_FORMAT_UNCOMPRESSED, 160, 120, 333333);

The max payload for this setting is 384 (identified from ux_host_class_video_max_payload_get). Can you help explain the format of this data so we can repackage it and send it to the PC program?
Check the uncompressed format spec in https://usb.org/sites/default/files/USB_Video_Class_1_5.zip.
The supported pixel encodings (for example YUY2 and NV12) are identified by format GUIDs in the format descriptors, and each payload is composed of a header followed by the actual data. You can refer to the spec for more details.
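As a sketch of what the spec defines (the bit names follow the UVC payload header; the helper function is illustrative, not a USBX API):

/* bmHeaderInfo bits in the UVC payload header. */
#define UVC_PAYLOAD_FID  (1u << 0)  /* Frame ID, toggles each frame     */
#define UVC_PAYLOAD_EOF  (1u << 1)  /* End of Frame                     */
#define UVC_PAYLOAD_PTS  (1u << 2)  /* Presentation Time Stamp present  */
#define UVC_PAYLOAD_SCR  (1u << 3)  /* Source Clock Reference present   */
#define UVC_PAYLOAD_ERR  (1u << 6)  /* Error in this payload            */

/* Return a pointer to the pixel data inside one received payload. */
static UCHAR *uvc_payload_data(UCHAR *payload, ULONG payload_length, ULONG *data_length)
{
    UCHAR header_length = payload[0];            /* bHeaderLength        */
    *data_length = payload_length - header_length;
    return(payload + header_length);             /* data follows header  */
}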
Hi @xiaocq2001, is there any example of a UVC host running on an STM32H7 target?
Hi @xiaocq2001, does the USBX host video stack support still-image capture? Do you have example code if it is supported?
@xianghui-renesas, still-image capture is not supported currently.
Hi @xiaocq2001, I tried to piece together the packets collected on the MCU to display them using a feature in our e2studio IDE, and found the packets are out of order in the packet buffers. I have 96 packet buffers. If the MCU does not return buffers fast enough for the frame rate, will it start to skip packets? Do you have experience with this? Thanks!
Hi @xiaocq2001, your example so far uses a stream-based protocol; does the stack support a frame-based protocol, and do you have an example of a frame-based implementation? I think it may be easier to look at the raw image in the MCU buffer with a frame-based implementation.
@xianghui-renesas, do you mean detecting where a video frame ends in the USB packets? For Motion JPEG, if you check the spec section on the payload header for each USB packet, there is an EOF bit that indicates a frame end.
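For illustration, building on the uvc_payload_data() sketch earlier, frames can be assembled by appending payload data and cutting at EOF (frame_buffer, frame_offset, and process_frame() are hypothetical application names; this fragment would run per received payload):

ULONG data_length;
UCHAR *data = uvc_payload_data(payload, payload_length, &data_length);

/* Append this payload's pixel data to the frame under assembly. */
memcpy(frame_buffer + frame_offset, data, data_length);
frame_offset += data_length;

/* EOF bit set in bmHeaderInfo: a complete frame is ready. */
if (payload[1] & UVC_PAYLOAD_EOF)
{
    process_frame(frame_buffer, frame_offset);
    frame_offset = 0;
}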
Hi @xiaocq2001, @yuxin-azrtos, I am working on USB host isochronous support. With a single transaction per microframe I am able to see the video streaming and it works fine, but with multiple transactions per microframe it is not working and I am not seeing any valid frames.
- I am using the demo app above. Do I need to change the application, or could you please suggest how to approach this further?
I really appreciate your help on this.
Thanks, Mahesh
@Mahesh-dev-avula I think the application is fine for multiple transactions per microframe. Maybe you can check whether multiple transactions per microframe are supported by your host controller, or whether the host controller driver needs modification to support them.
@xiaocq2001, thank you for responding to my query. Yes, my host controller supports multiple transactions per microframe; the same hardware works on Linux, and my driver code is implemented based on the Linux reference (the xHCI driver). The difference I found between Linux and Azure RTOS is that Linux prepares multiple buffers at a time and sends the command to the hardware, while in the RTOS the application requests only one buffer at a time, which works for a single transaction per microframe. I was thinking we may need to prepare multiple buffers at a time for multiple transactions. Please correct me if I am wrong.
Thanks, Mahesh