
gstBufferManager -- failed to map EGLImage from NVMM buffer

Kajatin opened this issue 2 years ago • 14 comments

Hello,

I have updated my jetson-utils clone today with the latest changes in main and videoSource started giving me problems.

It seems like it can't get the frame from my RTSP camera. I'm running on a Jetson Xavier NX dev board. Please see the attached logs.

[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstDecoder -- creating decoder for admin:[email protected]
Opening in BLOCKING MODE
Opening in BLOCKING MODE 
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
[gstreamer] gstDecoder -- discovered video resolution: 2560x1920  (framerate 0.000000 Hz)
[gstreamer] gstDecoder -- discovered video caps:  video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)5.1, profile=(string)high, width=(int)2560, height=(int)1920, framerate=(fraction)0/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
[gstreamer] gstDecoder -- pipeline string:
[gstreamer] rtspsrc location=rtsp://admin:[email protected]:554//h264Preview_01_main latency=2000 ! queue ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv flip-method=6 ! video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920, format=(string)NV12 ! appsink name=mysink
[video]  created gstDecoder from rtsp://admin:[email protected]:554//h264Preview_01_main
------------------------------------------------
gstDecoder video options:
------------------------------------------------
  -- URI: rtsp://admin:[email protected]:554//h264Preview_01_main
     - protocol:  rtsp
     - location:  admin:[email protected]
     - port:      554
  -- deviceType: ip
  -- ioType:     input
  -- codec:      h264
  -- width:      2560
  -- height:     1920
  -- frameRate:  0.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: vertical
  -- loop:       0
  -- rtspLatency 2000
------------------------------------------------
[gstreamer] opening gstDecoder for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> nvvconv0
[gstreamer] gstreamer changed state from NULL to READY ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from NULL to READY ==> h264parse1
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264depay1
[gstreamer] gstreamer changed state from NULL to READY ==> queue0
[gstreamer] gstreamer changed state from NULL to READY ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> nvvconv0
[gstreamer] gstreamer changed state from READY to PAUSED ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from READY to PAUSED ==> h264parse1
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264depay1
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> queue0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtspsrc0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> nvvconv0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> omxh264dec-omxh264dec0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> h264parse1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264depay1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> queue0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> manager
[gstreamer] gstreamer changed state from READY to PAUSED ==> manager
[gstreamer] gstreamer changed state from NULL to READY ==> rtpssrcdemux2
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpssrcdemux2
[gstreamer] gstreamer changed state from NULL to READY ==> rtpsession2
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpsession2
[gstreamer] gstreamer changed state from NULL to READY ==> funnel4
[gstreamer] gstreamer changed state from READY to PAUSED ==> funnel4
[gstreamer] gstreamer changed state from NULL to READY ==> funnel5
[gstreamer] gstreamer changed state from READY to PAUSED ==> funnel5
[gstreamer] gstreamer changed state from NULL to READY ==> rtpstorage2
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpstorage2
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> rtpssrcdemux3
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpssrcdemux3
[gstreamer] gstreamer changed state from NULL to READY ==> rtpsession3
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpsession3
[gstreamer] gstreamer changed state from NULL to READY ==> funnel6
[gstreamer] gstreamer changed state from READY to PAUSED ==> funnel6
[gstreamer] gstreamer changed state from NULL to READY ==> funnel7
[gstreamer] gstreamer changed state from READY to PAUSED ==> funnel7
[gstreamer] gstreamer changed state from NULL to READY ==> rtpstorage3
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpstorage3
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer changed state from NULL to READY ==> udpsink4
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsink4
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsink4
[gstreamer] gstreamer changed state from NULL to READY ==> fakesrc2
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> fakesrc2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> fakesrc2
[gstreamer] gstreamer changed state from NULL to READY ==> udpsink6
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsink6
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsink6
[gstreamer] gstreamer changed state from NULL to READY ==> fakesrc3
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> fakesrc3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> fakesrc3
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpssrcdemux3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpstorage3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpsession3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> funnel6
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> funnel7
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpssrcdemux2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpstorage2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpsession2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> funnel4
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> funnel5
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> manager
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsrc5
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsrc5
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsrc6
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsrc6
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsrc7
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsrc7
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> udpsrc8
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> udpsrc8
[gstreamer] gstreamer message progress ==> rtspsrc0
[gstreamer] gstreamer message element ==> rtpsession3
[gstreamer] gstreamer message element ==> rtpsession2
[gstreamer] gstreamer changed state from NULL to READY ==> rtpptdemux2
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpptdemux2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpptdemux2
[gstreamer] gstreamer changed state from NULL to READY ==> rtpjitterbuffer2
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpjitterbuffer2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpjitterbuffer2
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from NULL to READY ==> rtpptdemux3
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpptdemux3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpptdemux3
[gstreamer] gstreamer changed state from NULL to READY ==> rtpjitterbuffer3
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtpjitterbuffer3
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtpjitterbuffer3
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstDecoder -- failed to retrieve next image buffer

(Stream:3712): GStreamer-CRITICAL **: 10:54:04.339: gst_caps_is_empty: assertion 'GST_IS_CAPS (caps)' failed

(Stream:3712): GStreamer-CRITICAL **: 10:54:04.339: gst_caps_truncate: assertion 'GST_IS_CAPS (caps)' failed

(Stream:3712): GStreamer-CRITICAL **: 10:54:04.339: gst_caps_fixate: assertion 'GST_IS_CAPS (caps)' failed

(Stream:3712): GStreamer-CRITICAL **: 10:54:04.339: gst_caps_get_structure: assertion 'GST_IS_CAPS (caps)' failed

(Stream:3712): GStreamer-CRITICAL **: 10:54:04.339: gst_structure_get_string: assertion 'structure != NULL' failed

(Stream:3712): GStreamer-CRITICAL **: 10:54:04.340: gst_mini_object_unref: assertion 'mini_object != NULL' failed
NvMMLiteOpen : Block : BlockType = 261 
NVMEDIA: Reading vendor.tegra.display-size : status: 6 
NvMMLiteBlockCreate : Block : BlockType = 261 
Allocating new output: 2560x1920 (x 16), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3605: Send OMX_EventPortSettingsChanged: nFrameWidth = 2560, nFrameHeight = 1920 
[gstreamer] gstDecoder -- onPreroll()
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer mysink taglist, video-codec=(string)"H.264\ \(High\ Profile\)";
[gstreamer] gstDecoder -- failed to retrieve next image buffer
[gstreamer] gstDecoder -- failed to retrieve next image buffer
[gstreamer] gstBufferManager recieve caps:  video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)NV12
[gstreamer] gstBufferManager -- recieved first frame, codec=h264 format=nv12 width=2560 height=1920 size=1008
[gstreamer] gstBufferManager -- recieved NVMM memory
NvEGLImageFromFd: No EGLDisplay to create EGLImage
[gstreamer] gstBufferManager -- failed to map EGLImage from NVMM buffer
[gstreamer] gstDecoder -- failed to handle incoming buffer
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
[gstreamer] gstBufferManager recieve caps:  video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)NV12
[gstreamer] gstBufferManager -- recieved first frame, codec=h264 format=nv12 width=2560 height=1920 size=1008
[gstreamer] gstBufferManager -- recieved NVMM memory
NvEGLImageFromFd: No EGLDisplay to create EGLImage
[gstreamer] gstBufferManager -- failed to map EGLImage from NVMM buffer
[gstreamer] gstDecoder -- failed to handle incoming buffer
[gstreamer] gstBufferManager recieve caps:  video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)NV12
[gstreamer] gstBufferManager -- recieved first frame, codec=h264 format=nv12 width=2560 height=1920 size=1008
[gstreamer] gstBufferManager -- recieved NVMM memory
NvEGLImageFromFd: No EGLDisplay to create EGLImage
[gstreamer] gstBufferManager -- failed to map EGLImage from NVMM buffer
[gstreamer] gstDecoder -- failed to handle incoming buffer
[gstreamer] gstBufferManager recieve caps:  video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)NV12
[gstreamer] gstBufferManager -- recieved first frame, codec=h264 format=nv12 width=2560 height=1920 size=1008
[gstreamer] gstBufferManager -- recieved NVMM memory
NvEGLImageFromFd: No EGLDisplay to create EGLImage
[gstreamer] gstBufferManager -- failed to map EGLImage from NVMM buffer
[gstreamer] gstDecoder -- failed to handle incoming buffer
[gstreamer] gstBufferManager -- map buffer size was less than max size (1008 vs 1015)
[gstreamer] gstBufferManager recieve caps:  video/x-raw(memory:NVMM), width=(int)2560, height=(int)1920, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)0/1, format=(string)NV12
[gstreamer] gstBufferManager -- recieved first frame, codec=h264 format=nv12 width=2560 height=1920 size=1015
[gstreamer] gstBufferManager -- recieved NVMM memory
NvEGLImageFromFd: No EGLDisplay to create EGLImage
[gstreamer] gstBufferManager -- failed to map EGLImage from NVMM buffer
[gstreamer] gstDecoder -- failed to handle incoming buffer

Kajatin avatar Dec 03 '21 09:12 Kajatin

Note: if I disable ENABLE_NVMM, it works. I wonder whether I could get better performance with NVMM enabled? Otherwise I'll just build with the flag off, since everything works fine that way.

Kajatin avatar Dec 03 '21 10:12 Kajatin

Hi @Kajatin, yes, you can typically get better performance using NVMM, but it seems that in some circumstances it does not work (I am still trying to find out why). So for now, please continue building with ENABLE_NVMM=off, thank you.

dusty-nv avatar Dec 03 '21 15:12 dusty-nv

Thanks for the update @dusty-nv . I will continue building it with the flag off.

Kajatin avatar Dec 03 '21 16:12 Kajatin

Stupid question: how do I set the parameter ENABLE_NVMM=off? I'm building Python applications and am not sure how to use build parameters. Do I have to build this repository on my Jetson Nano with the parameters in the CMake file?

edrethardo avatar Jan 17 '22 12:01 edrethardo

When you build this project with CMake, you can pass settings to it like: cmake -DENABLE_NVMM=off [rest_of_the_command].
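For example, a rough sketch of a full rebuild of a standalone jetson-utils checkout (assuming an existing build/ directory; the same flag works when building through jetson-inference, as shown further down in this thread) would be:

cd jetson-utils/build
cmake -DENABLE_NVMM=off ../
make
sudo make install
sudo ldconfig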

Kajatin avatar Jan 17 '22 16:01 Kajatin

I built with ENABLE_NVMM enabled and faced the same problem when using the Jetson Nano in headless mode. If the LightDM GUI is enabled, everything works fine. Hope this helps solve the issue.

phongphuhanam avatar Mar 21 '22 10:03 phongphuhanam

I have my Jetson kit in a remote desktop connection. May I know how to enable the LightDM GUI?

Vinu-Suhas avatar May 28 '22 10:05 Vinu-Suhas

I have my Jetson kit in a remote desktop connection. May I know how to enable the LightDM GUI?

Hi @Vinumax969, see here: https://github.com/dusty-nv/jetson-inference/blob/master/docs/pytorch-transfer-learning.md#disabling-the-desktop-gui
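The commands on that page aren't specific to jetson-inference; they are standard Ubuntu/systemd commands, so re-enabling the desktop GUI on boot should look roughly like:

sudo systemctl set-default graphical.target   # start the desktop GUI on boot
sudo reboot

and sudo systemctl set-default multi-user.target switches back to headless (console-only) mode.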

dusty-nv avatar May 31 '22 13:05 dusty-nv

Hi everyone, may I know how I can disable NVMM? I didn't understand from the conversation. Is there a tutorial available?

adnan4502 avatar Dec 30 '22 12:12 adnan4502

Hi @adnan4502, rebuild it with the -DENABLE_NVMM=off cmake option:

cd jetson-inference/build
cmake -DENABLE_NVMM=off ../
make
sudo make install

dusty-nv avatar Dec 30 '22 15:12 dusty-nv

@dusty-nv I found your comment and did the same... Now I am facing another error:

[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/cuda/cudaYUV-NV12.cu:154
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/cuda/cudaColorspace.cpp:42
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/codec/gstBufferManager.cpp:435
[gstreamer] gstBufferManager -- unsupported image format (rgb8)
[gstreamer] supported formats are:
[gstreamer] * rgb8
[gstreamer] * rgba8
[gstreamer] * rgb32f
[gstreamer] * rgba32f
[gstreamer] gstDecoder -- failed to retrieve next image buffer
segnet: failed to capture video frame
[gstreamer] gstDecoder -- end of stream (EOS)
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/display/glTexture.cpp:360

[TRT] ------------------------------------------------
[TRT] Timing Report networks/FCN-ResNet18-MHP-512x320/fcn_resnet18.onnx
[TRT] ------------------------------------------------
[TRT] Pre-Process   CPU   0.06500ms  CUDA   1.94229ms
[TRT] Network       CPU  34.22566ms  CUDA  29.76141ms
[TRT] Post-Process  CPU   0.76584ms  CUDA   0.96297ms
[TRT] Visualize     CPU   0.04938ms  CUDA   6.08677ms
[TRT] Total         CPU  35.10588ms  CUDA  38.75344ms
[TRT] ------------------------------------------------

[gstreamer] gstDecoder -- end of stream (EOS) has been reached, stream has been closed
segnet: shutting down...
[gstreamer] gstDecoder -- stopping pipeline, transitioning to GST_STATE_NULL
[gstreamer] gstDecoder -- pipeline stopped
segnet: shutdown complete.

A blank screen appears and goes away after 10-15 seconds.

I tried again with my camera with video-viewer /dev/video0; the output is shown in the attached screenshot.

The video viewer is blank and the errors can be seen in the screenshot. The error I get is:

[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/display/glTexture.cpp:360
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/cuda/cudaYUV-YV12.cu:119
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/cuda/cudaColorspace.cpp:53
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/codec/gstBufferManager.cpp:435
[gstreamer] gstBufferManager -- unsupported image format (rgb8)
[gstreamer] supported formats are:
[gstreamer] * rgb8
[gstreamer] * rgba8
[gstreamer] * rgb32f
[gstreamer] * rgba32f
[gstreamer] gstDecoder -- failed to retrieve next image buffer
video-viewer: failed to capture video frame

What can I do to solve this problem? Please help me.

adnan4502 avatar Dec 30 '22 15:12 adnan4502

@adnan4502 can you copy & paste the console terminal log from the beginning of running video-viewer?

Also I don't believe the OpenGL display will work over SSH / X11-tunnelling because it uses CUDA/OpenGL interoperability. Instead please use RTP to view the video remotely: https://github.com/dusty-nv/jetson-inference/blob/master/docs/aux-streaming.md#transmitting-rtp
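A rough sketch of the RTP setup from that page (the IP address, port, and bitrate below are placeholders): on the Jetson, send the stream to the remote machine with something like

video-viewer --bitrate=1000000 /dev/video0 rtp://<remote-ip>:1234

and on the receiving PC, view it with a GStreamer pipeline along the lines of

gst-launch-1.0 -v udpsrc port=1234 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink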

dusty-nv avatar Dec 30 '22 15:12 dusty-nv

nano-jetson@nanojetson-desktop:~$ cd jetson-inference/
nano-jetson@nanojetson-desktop:~/jetson-inference$ video-viewer /dev/video0
[gstreamer] initialized gstreamer, version 1.14.5.0
[gstreamer] gstCamera -- attempting to create device v4l2:///dev/video0
[gstreamer] gstCamera -- found v4l2 device: Webcam C170
[gstreamer] v4l2-proplist, device.path=(string)/dev/video0, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Webcam\ C170", v4l2.device.bus_info=(string)usb-70090000.xusb-2.2, v4l2.device.version=(uint)264703, v4l2.device.capabilities=(uint)2216689665, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found 20 caps for v4l2 device /dev/video0
[gstreamer] [0] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [1] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [2] video/x-raw, format=(string)YUY2, width=(int)544, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [3] video/x-raw, format=(string)YUY2, width=(int)432, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [4] video/x-raw, format=(string)YUY2, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [5] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [6] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)176, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [7] video/x-raw, format=(string)YUY2, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [8] video/x-raw, format=(string)YUY2, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [9] image/jpeg, width=(int)1024, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [10] image/jpeg, width=(int)800, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [11] image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [12] image/jpeg, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [13] image/jpeg, width=(int)544, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [14] image/jpeg, width=(int)432, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [15] image/jpeg, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [16] image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [17] image/jpeg, width=(int)320, height=(int)176, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [18] image/jpeg, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] [19] image/jpeg, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
[gstreamer] gstCamera -- selected device profile: codec=mjpeg format=unknown width=1024 height=768
[gstreamer] gstCamera pipeline string:
[gstreamer] v4l2src device=/dev/video0 do-timestamp=true ! image/jpeg, width=(int)1024, height=(int)768 ! jpegdec ! video/x-raw ! appsink name=mysink
[gstreamer] gstCamera successfully created device v4l2:///dev/video0
[video] created gstCamera from v4l2:///dev/video0

gstCamera video options:

  -- URI: v4l2:///dev/video0
     - protocol:  v4l2
     - location:  /dev/video0
  -- deviceType: v4l2
  -- ioType:     input
  -- codec:      mjpeg
  -- width:      1024
  -- height:     768
  -- frameRate:  30.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000

[OpenGL] glDisplay -- X screen 0 resolution: 1920x1017
[OpenGL] glDisplay -- X window resolution: 1920x1017
[OpenGL] glDisplay -- display device initialized (1920x1017)
[video] created glDisplay from display://0

glDisplay video options:

  -- URI: display://0
     - protocol:  display
     - location:  0
  -- deviceType: display
  -- ioType:     output
  -- codec:      raw
  -- width:      1920
  -- height:     1017
  -- frameRate:  0.000000
  -- bitRate:    0
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
  -- loop:       0
  -- rtspLatency 2000

[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> jpegdec0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> v4l2src0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> jpegdec0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> v4l2src0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> jpegdec0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> v4l2src0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstCamera -- onPreroll
[gstreamer] gstBufferManager -- map buffer size was less than max size (1179648 vs 1179655)
[gstreamer] gstBufferManager recieve caps: video/x-raw, format=(string)I420, width=(int)1024, height=(int)768, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)1:4:0:0, framerate=(fraction)30/1
[gstreamer] gstBufferManager -- recieved first frame, codec=mjpeg format=i420 width=1024 height=768 size=1179655
RingBuffer -- allocated 4 buffers (1179655 bytes each, 4718620 bytes total)
RingBuffer -- allocated 4 buffers (8 bytes each, 32 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
RingBuffer -- allocated 4 buffers (2359296 bytes each, 9437184 bytes total)
video-viewer: captured 1 frames (1024 x 768)
[OpenGL] glDisplay -- set the window size to 1024x768
[OpenGL] creating 1024x768 texture (GL_RGB8 format, 2359296 bytes)
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/display/glTexture.cpp:360
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/cuda/cudaYUV-YV12.cu:119
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/cuda/cudaColorspace.cpp:53
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/codec/gstBufferManager.cpp:435
[gstreamer] gstBufferManager -- unsupported image format (rgb8)
[gstreamer] supported formats are:
[gstreamer] * rgb8
[gstreamer] * rgba8
[gstreamer] * rgb32f
[gstreamer] * rgba32f
[gstreamer] gstDecoder -- failed to retrieve next image buffer
video-viewer: failed to capture video frame
video-viewer: captured 2 frames (1024 x 768)
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/display/glTexture.cpp:360
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/cuda/cudaYUV-YV12.cu:119
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/cuda/cudaColorspace.cpp:53
[cuda] unknown error (error 999) (hex 0x3E7)
[cuda] /home/nano-jetson/jetson-inference/utils/codec/gstBufferManager.cpp:435
[gstreamer] gstBufferManager -- unsupported image format (rgb8)
[gstreamer] supported formats are:
[gstreamer] * rgb8
[gstreamer] * rgba8
[gstreamer] * rgb32f
[gstreamer] * rgba32f
[gstreamer] gstDecoder -- failed to retrieve next image buffer
video-viewer: failed to capture video frame

@dusty-nv

adnan4502 avatar Dec 30 '22 15:12 adnan4502

Hey again, I tried but sadly failed at first... but then I tried it with OpenCV in a Python program and it worked.

adnan4502 avatar Dec 31 '22 18:12 adnan4502