
RTSP Video Output Fails on Jetpack 6

Open hunterwn opened this issue 1 year ago • 13 comments

I ran the following Python script after compiling and installing jetson-utils on my Orin devkit:

from jetson_utils import videoSource, videoOutput

options = {}
video_input='/dev/video0'

input_stream = videoSource(video_input, options=options)


video_output='rtsp://@:1234/output'
output_stream = videoOutput(video_output)

while True:
    image = input_stream.Capture(format='rgb8', timeout=2500)
    if image is None:  # capture timed out
        continue
    output_stream.Render(image)

The result is that the camera displays fine in the OpenGL window, but the RTSP output does not stream successfully.

Here is the full output of the script:

python3 ./test.py
[gstreamer] initialized gstreamer, version 1.20.3.0
[gstreamer] gstCamera -- attempting to create device v4l2:///dev/video0
[gstreamer] gstCamera -- found v4l2 device: Logitech BRIO
[gstreamer] v4l2-proplist, device.path=(string)/dev/video2, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Logitech\ BRIO", v4l2.device.bus_info=(string)usb-3610000.usb-3.4.4, v4l2.device.version=(uint)331656, v4l2.device.capabilities=(uint)2225078273, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found v4l2 device: Logitech BRIO
[gstreamer] v4l2-proplist, device.path=(string)/dev/video0, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Logitech\ BRIO", v4l2.device.bus_info=(string)usb-3610000.usb-3.4.4, v4l2.device.version=(uint)331656, v4l2.device.capabilities=(uint)2225078273, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found 43 caps for v4l2 device /dev/video0
[gstreamer] [0] video/x-raw, format=(string)YUY2, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [1] video/x-raw, format=(string)YUY2, width=(int)1600, height=(int)896, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [2] video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [3] video/x-raw, format=(string)YUY2, width=(int)1024, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [4] video/x-raw, format=(string)YUY2, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [5] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [6] video/x-raw, format=(string)YUY2, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [7] video/x-raw, format=(string)YUY2, width=(int)800, height=(int)448, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [8] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [9] video/x-raw, format=(string)YUY2, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [10] video/x-raw, format=(string)YUY2, width=(int)440, height=(int)440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
[gstreamer] [11] video/x-raw, format=(string)YUY2, width=(int)480, height=(int)270, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [12] video/x-raw, format=(string)YUY2, width=(int)340, height=(int)340, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
[gstreamer] [13] video/x-raw, format=(string)YUY2, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [14] video/x-raw, format=(string)YUY2, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [15] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [16] video/x-raw, format=(string)YUY2, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [17] video/x-raw, format=(string)YUY2, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [18] video/x-raw, format=(string)YUY2, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [19] video/x-raw, format=(string)NV12, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [20] video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [21] video/x-raw, format=(string)NV12, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [22] video/x-raw, format=(string)NV12, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [23] image/jpeg, width=(int)4096, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [24] image/jpeg, width=(int)3840, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [25] image/jpeg, width=(int)2560, height=(int)1440, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [26] image/jpeg, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [27] image/jpeg, width=(int)1600, height=(int)896, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [28] image/jpeg, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 90/1, 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [29] image/jpeg, width=(int)1024, height=(int)576, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [30] image/jpeg, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [31] image/jpeg, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [32] image/jpeg, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [33] image/jpeg, width=(int)800, height=(int)448, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [34] image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 120/1, 90/1, 60/1, 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [35] image/jpeg, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [36] image/jpeg, width=(int)480, height=(int)270, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [37] image/jpeg, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [38] image/jpeg, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [39] image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [40] image/jpeg, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [41] image/jpeg, width=(int)176, height=(int)144, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] [42] image/jpeg, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 24/1, 20/1, 15/1, 10/1, 15/2, 5/1 };
[gstreamer] gstCamera -- selected device profile:  codec=raw format=yuyv width=1280 height=720 framerate=30
[gstreamer] gstCamera pipeline string:
[gstreamer] v4l2src device=/dev/video0 do-timestamp=true ! video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=30/1 ! appsink name=mysink sync=false
[gstreamer] gstCamera successfully created device v4l2:///dev/video0
[video]  created gstCamera from v4l2:///dev/video0
------------------------------------------------
gstCamera video options:
------------------------------------------------
 -- URI: v4l2:///dev/video0
    - protocol:  v4l2
    - location:  /dev/video0
 -- deviceType: v4l2
 -- ioType:     input
 -- codec:      raw
 -- codecType:  cpu
 -- width:      1280
 -- height:     720
 -- frameRate:  30
 -- numBuffers: 4
 -- zeroCopy:   true
 -- flipMethod: none
------------------------------------------------
[gstreamer] gstEncoder -- codec not specified, defaulting to H.264
[gstreamer] gstEncoder -- detected board 'NVIDIA Jetson AGX Orin Developer Kit'
[gstreamer] gstEncoder -- pipeline launch string:
[gstreamer] appsrc name=mysource is-live=true do-timestamp=true format=3 ! nvvidconv name=vidconv ! video/x-raw(memory:NVMM) ! nvv4l2h264enc name=encoder bitrate=4000000 insert-sps-pps=1 insert-vui=1 idrinterval=30 maxperf-enable=1 ! video/x-h264 ! rtph264pay config-interval=1 name=pay0
[rtsp]   waiting for RTSP server to start...
[rtsp]   RTSP server started @ rtsp://ubuntu:1234
[rtsp]   RTSP route added /output @ rtsp://ubuntu:1234
[video]  created gstEncoder from rtsp://@:1234/output
------------------------------------------------
gstEncoder video options:
------------------------------------------------
 -- URI: rtsp://@:1234/output
    - protocol:  rtsp
    - location:  0.0.0.0
    - port:      1234
 -- deviceType: ip
 -- ioType:     output
 -- codec:      H264
 -- codecType:  v4l2
 -- frameRate:  30
 -- bitRate:    4000000
 -- numBuffers: 4
 -- zeroCopy:   true
 -- latency     10
------------------------------------------------
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1200
[OpenGL] glDisplay -- X window resolution:    1920x1200
[OpenGL] glDisplay -- display device initialized (1920x1200)
[video]  created glDisplay from display://0
------------------------------------------------
glDisplay video options:
------------------------------------------------
 -- URI: display://0
    - protocol:  display
    - location:  0
 -- deviceType: display
 -- ioType:     output
 -- width:      1920
 -- height:     1200
 -- frameRate:  0
 -- numBuffers: 4
 -- zeroCopy:   true
------------------------------------------------
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> v4l2src0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> v4l2src0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> v4l2src0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstCamera -- onPreroll
[gstreamer] gstBufferManager recieve caps:  video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, colorimetry=(string)2:4:5:1
[gstreamer] gstBufferManager -- recieved first frame, codec=raw format=yuyv width=1280 height=720 size=1843200
[cuda]   allocated 4 ring buffers (1843200 bytes each, 7372800 bytes total)
[cuda]   allocated 4 ring buffers (8 bytes each, 32 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer message latency ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
[gstreamer] gstreamer message qos ==> v4l2src0
[cuda]   allocated 4 ring buffers (2764800 bytes each, 11059200 bytes total)
[OpenGL] glDisplay -- set the window size to 1280x720
[OpenGL] creating 1280x720 texture (GL_RGB8 format, 2764800 bytes)
[cuda]   registered openGL texture for interop access (1280x720, GL_RGB8, 2764800 bytes)
[cuda]   allocated 2 ring buffers (1382400 bytes each, 2764800 bytes total)
[gstreamer] gstEncoder -- starting pipeline, transitioning to GST_STATE_PLAYING
[gstreamer] gstreamer message qos ==> v4l2src0
Opening in BLOCKING MODE 
[gstreamer] gstreamer changed state from NULL to READY ==> pay0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter2
[gstreamer] gstreamer changed state from NULL to READY ==> encoder
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> vidconv
[gstreamer] gstreamer changed state from NULL to READY ==> mysource
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline1
[gstreamer] gstreamer changed state from READY to PAUSED ==> pay0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter2
[gstreamer] gstreamer changed state from READY to PAUSED ==> encoder
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> vidconv
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysource
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline1
[gstreamer] gstreamer message new-clock ==> pipeline1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pay0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> encoder
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> vidconv
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysource
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline1
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstEncoder -- new caps: video/x-raw, width=1280, height=720, format=(string)I420, framerate=30/1
NvMMLiteOpen : Block : BlockType = 4 
===== NvVideo: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
[gstreamer] gstreamer message latency ==> encoder
H264: Profile = 66 Level = 0 
NVMEDIA: Need to set EMC bandwidth : 376000 
NvVideo: bBlitMode is set to TRUE 
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message qos ==> encoder
[gstreamer] gstreamer mysource ERROR Internal data stream error.
[gstreamer] gstreamer Debugging info: ../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline1/GstAppSrc:mysource:
streaming stopped, reason not-linked (-1)

hunterwn avatar Jun 03 '24 21:06 hunterwn

Is there any update on this?

InDroKalena avatar Jul 16 '24 20:07 InDroKalena

I had the exact same issue: webrtc works, but rtsp does not, failing with the same internal data stream error:

[gstreamer] gstreamer mysource ERROR Internal data stream error.
[gstreamer] gstreamer Debugging info: gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline2/GstAppSrc:mysource:
streaming stopped, reason not-linked (-1)

With the exact same options:

output = videoOutput("webrtc://@:8554/output", options={'codec': 'h265', 'bitrate': 1000000})
output2 = videoOutput("rtsp://@:8555/output", options={'codec': 'h265', 'bitrate': 1000000})

YIFEI-MA avatar Jul 22 '24 08:07 YIFEI-MA

The same issue occurs with all the utility tools, including video-viewer, detectNet, etc.

The output below is from this command: $ video-viewer /dev/video0 rtsp://@:1234/my_output

[gstreamer] gstreamer message qos ==> encoder
[gstreamer] gstreamer mysource ERROR Internal data stream error.
[gstreamer] gstreamer Debugging info: ../libs/gst/base/gstbasesrc.c(3127): gst_base_src_loop (): /GstPipeline:pipeline1/GstAppSrc:mysource:
streaming stopped, reason not-linked (-1)

pliolis avatar Aug 31 '24 14:08 pliolis

Hi guys, sorry I haven't had time to dig into this - you can find where the RTSP pipeline is constructed here:

https://github.com/dusty-nv/jetson-utils/blob/e6aa502c9c612e0e89e513b7e60c5f98a755820f/codec/gstEncoder.cpp#L415C24-L415C28

My guess is that there were changes in the GstRtspServer plugins? The code that uses those is here:

https://github.com/dusty-nv/jetson-utils/blob/master/network/RTSPServer.cpp

dusty-nv avatar Aug 31 '24 16:08 dusty-nv

Is there any update on this?

pliolis avatar Oct 29 '24 09:10 pliolis

Is there any update on this? Thanks.

JIA-HONG-CHU avatar Nov 20 '24 06:11 JIA-HONG-CHU

appsrc name=mysource is-live=true do-timestamp=true format=3 ! nvvidconv name=vidconv ! video/x-raw(memory:NVMM) ! nvv4l2h264enc name=encoder bitrate=4000000 insert-sps-pps=1 insert-vui=1 idrinterval=30 maxperf-enable=1 ! video/x-h264 ! queue ! rtph264pay config-interval=1 name=pay0

Hi @dusty-nv, adding queue ! before rtph264pay SOLVED the video output issue for me. Make this change in jetson-inference/utils/codec/gstEncoder.cpp, then rebuild and install.
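To make the effect of that change concrete, here is a small Python sketch that mocks the launch-string assembly. This is not jetson-utils code; the element list simply mirrors the pipeline logged earlier in this thread, and `build_launch_string` is a hypothetical helper invented for illustration:

```python
# Sketch: rebuild the H.264 RTSP launch string with and without the extra
# queue element. This mocks (does not call) the string-building that
# gstEncoder.cpp performs; element settings are copied from the thread's logs.

def build_launch_string(bitrate=4000000, add_queue=True):
    elements = [
        "appsrc name=mysource is-live=true do-timestamp=true format=3",
        "nvvidconv name=vidconv",
        "video/x-raw(memory:NVMM)",
        (f"nvv4l2h264enc name=encoder bitrate={bitrate} insert-sps-pps=1 "
         "insert-vui=1 idrinterval=30 maxperf-enable=1"),
        "video/x-h264",
    ]
    if add_queue:  # the fix: a buffering element before the payloader
        elements.append("queue")
    elements.append("rtph264pay config-interval=1 name=pay0")
    return " ! ".join(elements)

print(build_launch_string())
```

With `add_queue=True` the result contains `... video/x-h264 ! queue ! rtph264pay ...`, matching the patched pipeline string quoted above.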

ana-54n705h avatar Dec 05 '24 12:12 ana-54n705h

appsrc name=mysource is-live=true do-timestamp=true format=3 ! nvvidconv name=vidconv ! video/x-raw(memory:NVMM) ! nvv4l2h264enc name=encoder bitrate=4000000 insert-sps-pps=1 insert-vui=1 idrinterval=30 maxperf-enable=1 ! video/x-h264 ! queue ! rtph264pay config-interval=1 name=pay0

Hi @dusty-nv, adding queue ! before rtph264pay SOLVED the video output issue for me. Make this change in jetson-inference/utils/codec/gstEncoder.cpp, then rebuild and install.

Hi Ana, can you point out exactly how to apply this fix? Looking at jetson-inference/utils/codec/gstEncoder.cpp, I could not find the GStreamer string you posted; rather, I found multiple parts scattered around the file. Could you be more precise and point to exactly where the fix should be applied? Thanks!

Bonitodelcapo avatar Jan 14 '25 13:01 Bonitodelcapo

Hi @Bonitodelcapo,

Please locate the following line in jetson-inference/utils/codec/gstEncoder.cpp at line 402:

ss << "rtph264pay";

Replace it with:

ss << "queue ! rtph264pay";

After making this change, you can verify in the logs whether it is being applied to the GStreamer pipeline.

ana-54n705h avatar Jan 16 '25 09:01 ana-54n705h

Hi @ana-54n705h, thank you for your reply. I made the change and it is visible in the pipeline log:

[gstreamer] gstEncoder -- codec not specified, defaulting to H.264
[gstreamer] gstEncoder -- detected board 'NVIDIA Orin NX Developer Kit'
[gstreamer] gstEncoder -- pipeline launch string:
[gstreamer] appsrc name=mysource is-live=true do-timestamp=true format=3 ! nvvidconv name=vidconv ! video/x-raw(memory:NVMM) ! nvv4l2h264enc name=encoder bitrate=4000000 insert-sps-pps=1 insert-vui=1 idrinterval=30 maxperf-enable=1 ! video/x-h264 ! queue ! rtph264pay config-interval=1 name=pay0
[rtsp]   waiting for RTSP server to start...
[rtsp]   RTSP server started @ rtsp://orinNX:1234
[rtsp]   RTSP route added /my_output @ rtsp://orinNX:1234
[video]  created gstEncoder from rtsp://@:1234/my_output

But I still get the error:

[gstreamer] gstreamer mysource ERROR Internal data stream error.
[gstreamer] gstreamer Debugging info: gstbasesrc.c(3072): gst_base_src_loop (): /GstPipeline:pipeline1/GstAppSrc:mysource:
streaming stopped, reason not-linked (-1)
[gstreamer] gstreamer queue0 ERROR Internal data stream error.
[gstreamer] gstreamer Debugging info: gstqueue.c(988): gst_queue_handle_sink_event (): /GstPipeline:pipeline1/GstQueue:queue0:
streaming stopped, reason not-linked (-1)

Could you post your full working log so that I can compare it? Thanks!

Bonitodelcapo avatar Jan 16 '25 11:01 Bonitodelcapo

@Bonitodelcapo, just ignore the warning messages for now, and check whether the RTSP output video is coming through or not.

ana-54n705h avatar Jan 16 '25 13:01 ana-54n705h

I’m still getting the error too.

pliolis avatar Feb 06 '25 19:02 pliolis

Hey, I fixed this error by recompiling jetson-utils, but you need to revise gstEncoder.cpp in the codec folder. In gstEncoder.cpp, add

if( mOptions.resource.protocol == "rtsp" )
	ss << "queue max-size-buffers=1000 max-size-time=200000000 max-size-bytes=50000000 ! ";

between

else if( mOptions.codec == videoOptions::CODEC_MJPEG )
	ss << "! image/jpeg ! ";

and

if( mOptions.save.path.length() > 0 )
{
	ss << "tee name=savetee savetee. ! queue ! ";
	
	if( !gst_build_filesink(mOptions.save, mOptions.codec, ss) )
		return false;

	ss << "savetee. ! queue ! ";
}
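As a sketch of where that conditional lands, the tail of the launch string can be mocked in Python. This is hypothetical helper code for illustration, not jetson-utils; `encoder_tail` and its argument are made up, and the queue settings are copied from the snippet above:

```python
# Sketch: insert a bounded queue into the launch string only for RTSP outputs,
# mirroring the protocol check and queue settings from the patch above.
# This mocks the gstEncoder.cpp logic; it is not jetson-utils code.

QUEUE = ("queue max-size-buffers=1000 max-size-time=200000000 "
         "max-size-bytes=50000000 ! ")

def encoder_tail(protocol):
    ss = "video/x-h264 ! "         # caps that follow the encoder element
    if protocol == "rtsp":         # the fix applies only to RTSP output
        ss += QUEUE
    ss += "rtph264pay config-interval=1 name=pay0"
    return ss

print(encoder_tail("rtsp"))
print(encoder_tail("webrtc"))
```

The bounded `max-size-*` settings decouple the appsrc thread from the RTP payloader without letting the queue grow unboundedly if a client stalls.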

JIA-HONG-CHU avatar Mar 31 '25 08:03 JIA-HONG-CHU