
[Feat]: RTSP source

Open · perotom opened this issue 7 months ago · 23 comments

Version

3.0.0-RC

Environment that reproduces the issue

Emulator

Use case description

I want to restream a local RTSP stream (CCTV camera) via SRT to the cloud to make it more reliable.

Proposed solution

I read the advanced docs (https://github.com/ThibaultBee/StreamPack/blob/main/docs/AdvancedStreamer.md) and combined it with https://github.com/alexeyvasilyev/rtsp-client-android to get a low-latency stream. I want to display the stream and forward it via SRT.

So far I have built a custom class for this:

import android.net.Uri
import android.view.Surface
import com.alexvas.rtsp.codec.VideoDecodeThread.DecoderType
import com.alexvas.rtsp.codec.VideoDecoderSurfaceThread
import com.alexvas.rtsp.widget.RtspProcessor
import io.github.thibaultbee.streampack.core.elements.processing.video.source.DefaultSourceInfoProvider
import io.github.thibaultbee.streampack.core.elements.processing.video.source.ISourceInfoProvider
import io.github.thibaultbee.streampack.core.elements.sources.video.ISurfaceSourceInternal
import io.github.thibaultbee.streampack.core.elements.sources.video.IVideoSourceInternal
import io.github.thibaultbee.streampack.core.elements.sources.video.VideoSourceConfig
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.withContext

/**
 *  Minimal head-less RTSP client that decodes *video-only* into the encoder Surface.
 *
 *  Scope: H.264/H.265 over TCP, no audio, low-latency.
 */
class RtspClientSurfaceSource(
    val rtspUri: Uri,
    private val username: String? = null,
    private val password: String? = null,
    private val userAgent : String? = "StreamPack-RtspClient"
) : IVideoSourceInternal, ISurfaceSourceInternal {

    /* ---------- required StreamPack flows ---------- */
    private val _infoProviderFlow    = MutableStateFlow<ISourceInfoProvider>(DefaultSourceInfoProvider())
    override val infoProviderFlow: StateFlow<ISourceInfoProvider> = _infoProviderFlow

    private val _isStreaming         = MutableStateFlow(false)
    override val isStreamingFlow: StateFlow<Boolean>              = _isStreaming

    override val timestampOffsetInNs = 0L

    /* ---------- rtsp-client objects ---------- */
    private var encoderSurface : Surface?        = null
    private var rtspProcessor  : RtspProcessor?  = null

    /* ---------------------------------------------------------------------- */
    override suspend fun configure(config: VideoSourceConfig) {
        /* no-op – we create the processor lazily in startStream() once the
           encoder Surface has been provided via setOutput().                */
    }

    override suspend fun setOutput(surface: Surface) { encoderSurface = surface }

    override suspend fun getOutput(): Surface?     = encoderSurface
    override suspend fun resetOutput()             { encoderSurface = null }

    override suspend fun startStream() {
        check(! _isStreaming.value)          { "already started" }
        val surface = encoderSurface ?: error("setOutput() must be called first")

        // All rtsp-client operations must run on the main thread
        withContext(Dispatchers.Main.immediate) {
            rtspProcessor = RtspProcessor(
                onVideoDecoderCreateRequested = { mime, rot, queue, listener, _ ->
                    VideoDecoderSurfaceThread(
                        surface, mime,
                        /* dummy size – codec derives from SPS */ 1920, 1080,
                        rot, queue, listener, DecoderType.HARDWARE
                    )
                }
            ).apply {
                // absolutely *video only* for this source
                init(rtspUri, username, password, userAgent)
                start(requestVideo = true, requestAudio = false)
            }
        }
        _isStreaming.value = true
    }

    override suspend fun stopStream() {
        rtspProcessor?.stop()
        _isStreaming.value = false
    }

    override fun release() {
        rtspProcessor?.stop()
        rtspProcessor = null
        encoderSurface = null
    }
}

The error I get from this is:

/home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/core.cpp@6868:sendmsg2 08:21:54.643737/DefaultDispatch*E:SRT.as: @854765524: Wrong source time was provided. Sending is rejected.


onStreamError: java.net.SocketException: Operation not supported: Incorrect use of Message API
	io.github.thibaultbee.streampack.core.elements.endpoints.composites.sinks.ClosedException: java.net.SocketException: Operation not supported: Incorrect use of Message API (sendmsg/recvmsg)
		at io.github.thibaultbee.streampack.ext.srt.internal.endpoints.composites.sinks.SrtSink.write(SrtSink.kt:140)
		at io.github.thibaultbee.streampack.ext.srt.internal.endpoints.composites.sinks.SrtSink$write$1.invokeSuspend(Unknown Source:15)
		at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
		at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:98)
		at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:263)
		at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:95)
		at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:69)
		at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source:1)
		at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:47)
		at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source:1)
		at io.github.thibaultbee.streampack.core.elements.endpoints.composites.CompositeEndpoint$1.onOutputFrame(CompositeEndpoint.kt:56)
		at io.github.thibaultbee.streampack.core.elements.endpoints.composites.muxers.ts.utils.TSOutputCallback.writePacket(TSOutputCallback.kt:24)
		at io.github.thibaultbee.streampack.core.elements.endpoints.composites.muxers.ts.packets.TS.write(TS.kt:133)
		at io.github.thibaultbee.streampack.core.elements.endpoints.composites.muxers.ts.packets.Pes.write(Pes.kt:51)
		at io.github.thibaultbee.streampack.core.elements.endpoints.composites.muxers.ts.TsMuxer.generateStreams(TsMuxer.kt:161)
		at io.github.thibaultbee.streampack.core.elements.endpoints.composites.muxers.ts.TsMuxer.write(TsMuxer.kt:148)
		at io.github.thibaultbee.streampack.core.elements.endpoints.composites.CompositeEndpoint.write(CompositeEndpoint.kt:77)
		at io.github.thibaultbee.streampack.core.elements.endpoints.DynamicEndpoint.write$suspendImpl(DynamicEndpoint.kt:101)
		at io.github.thibaultbee.streampack.core.elements.endpoints.DynamicEndpoint.write(Unknown Source:0)
		at io.github.thibaultbee.streampack.core.pipelines.outputs.encoding.EncodingPipelineOutput$videoEncoderListener$1$onOutputFrame$1$1.invokeSuspend(EncodingPipelineOutput.kt:244)
		at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
		at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:100)
		at kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:263)
		at kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:95)
		at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:69)
		at kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source:1)
		at kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:47)
		at kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source:1)
		at io.github.thibaultbee.streampack.core.pipelines.outputs.encoding.EncodingPipelineOutput$videoEncoderListener$1.onOutputFrame(EncodingPipelineOutput.kt:243)
		at io.github.thibaultbee.streampack.core.elements.encoders.mediacodec.MediaCodecEncoder.processOutputFrameSync$lambda$9(MediaCodecEncoder.kt:371)
		at io.github.thibaultbee.streampack.core.elements.encoders.mediacodec.MediaCodecEncoder.$r8$lambda$JfRWavPzZuJnPtH_AZ7qsi0CZp8(Unknown Source:0)
		at io.github.thibaultbee.streampack.core.elements.encoders.mediacodec.MediaCodecEncoder$$ExternalSyntheticLambda5.run(D8$$SyntheticClass:0)
		at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
		at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
		at java.lang.Thread.run(Thread.java:1012)
Caused by: java.net.SocketException: Operation not supported: Incorrect use of Message API (sendmsg/recvmsg)
	at io.github.thibaultbee.srtdroid.core.models.SrtSocket.send(SrtSocket.kt:684)
	at io.github.thibaultbee.srtdroid.ktx.CoroutineSrtSocket.send$lambda$4(CoroutineSrtSocket.kt:389)
	at io.github.thibaultbee.srtdroid.ktx.CoroutineSrtSocket.$r8$lambda$R-v6WsWB6ftIkiuK0jEZhbCNRms(Unknown Source:0)
	at io.github.thibaultbee.srtdroid.ktx.CoroutineSrtSocket$$ExternalSyntheticLambda3.invoke(D8$$SyntheticClass:0)
	at io.github.thibaultbee.srtdroid.ktx.CoroutineSrtSocket.executeEpoll(CoroutineSrtSocket.kt:866)
	at io.github.thibaultbee.srtdroid.ktx.CoroutineSrtSocket.executeEpollWithTimeout(CoroutineSrtSocket.kt:819)
	at io.github.thibaultbee.srtdroid.ktx.CoroutineSrtSocket.access$executeEpollWithTimeout(CoroutineSrtSocket.kt:48)
	at io.github.thibaultbee.srtdroid.ktx.CoroutineSrtSocket$execute$3.invokeSuspend(CoroutineSrtSocket.kt:797)
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)

Any ideas what to do?

Alternative solutions

ChatGPT gave me the following solutions, although they are incorrect: A. is a hallucination, and I am not sure about B. or whether this is really the root cause.

A. Turn off the Message API. What it does: StreamPack calls the classic send() instead of sendmsg2; no timestamps are sent, so SRT can't complain. How to apply: create the SrtMediaDescriptor with messageApi = false (newer StreamPack versions default to true).

B. Keep the Message API but send valid timestamps. What it does: patches StreamPack to set srcTime = SRT.timeNow() for every packet. How to apply: requires editing the library (not recommended just to unblock you).

perotom avatar May 21 '25 06:05 perotom

Hello, Nice project you got there.

The error is that libsrt uses the capture timestamp and compares it with its own (monotonic) clock; it rejects the packet when its timestamp is too far behind or ahead.

A. is possible through the SrtUrl, but you will lose what makes SRT interesting. See https://github.com/ThibaultBee/srtdroid/blob/c97de81d81705b37360b1073a5e6ab7b1401b377/srtdroid-core/src/main/java/io/github/thibaultbee/srtdroid/core/models/SrtUrl.kt#L209
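
For illustration, a hypothetical sketch of option A, assuming the SrtUrl maps the standard libsrt URI query options (transtype, messageapi): file transtype without the message API skips the source-time check, at the cost of SRT's live-mode latency handling.

// hypothetical URL; the host, port and whether your StreamPack/srtdroid version forwards
// these query options are assumptions to verify against SrtUrl.kt
val srtUrl = "srt://my.server.example:9998?transtype=file&messageapi=0"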

ChatGPT's B would be the best solution, but it is easier than what ChatGPT said: use timestampOffsetInNs to align the packet timestamps with the internal monotonic clock. Do you have access to the clock or to the timestamp of the first packet?
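
Conceptually, the alignment suggested here is a one-line offset (a sketch; the unit of the source timestamp still has to be confirmed):

// firstPacketTimestampNs is an assumed name for the first source timestamp converted to nanoseconds
val timestampOffsetInNs = System.nanoTime() - firstPacketTimestampNs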

ThibaultBee avatar May 21 '25 07:05 ThibaultBee

Thanks for the information. So I guess the capture timestamp comes from the camera. I think it is impossible to keep the camera clock in sync with the phone (or at a fixed offset, as the clocks will drift over time). Sadly I can't think of a convenient way to access the timestamp of the first packet. I guess I would have to open the stream, manually decode the packet, close the stream and then start the streamer. Not sure if this would work.

Can you imagine introducing an overridable property for setting the capture time to now?

perotom avatar May 21 '25 07:05 perotom

Does the RtspProcessor have a timestamp member or let you have access to the timestamp?

ThibaultBee avatar May 21 '25 09:05 ThibaultBee

Yes, I found a way to access the first timestamp. In my example it is an arbitrary number (which matches RTP, as this number is not wall-clock time), something like 4923399 or 1157855.

override suspend fun startStream() {
    check(! _isStreaming.value)          { "already started" }
    val surface = encoderSurface ?: error("setOutput() must be called first")

    // All rtsp-client operations must run on the main thread
    withContext(Dispatchers.Main.immediate) {
        rtspProcessor = RtspProcessor(
            onVideoDecoderCreateRequested = { mime, rot, queue, listener, _ ->
                VideoDecoderSurfaceThread(
                    surface, mime,
                    /* dummy size – codec derives from SPS */ 1920, 1080,
                    rot, queue, listener, DecoderType.HARDWARE
                )
            }
        ).apply {
            // use the data listener to get the timestamp of the first frame
            dataListener = object : RtspDataListener {
                override fun onRtspDataVideoNalUnitReceived(
                    data: ByteArray,
                    offset: Int,
                    length: Int,
                    timestamp: Long
                ) {
                    if (!timestampFirstFrame) {
                        timestampFirstFrame = true
                        println("FOUND FIRST FRAME")
                        println(timestamp)
                    }
                }

                /* we ignore audio & application streams */
                override fun onRtspDataAudioSampleReceived(data: ByteArray, offset: Int, length: Int, timestamp: Long) = Unit
                override fun onRtspDataApplicationDataReceived(data: ByteArray, offset: Int, length: Int, timestamp: Long) = Unit
            }
            // absolutely *video only* for this source
            init(rtspUri, username, password, userAgent)
            start(requestVideo = true, requestAudio = false)
        }
    }
    _isStreaming.value = true
}

How do I use this now to calculate the offset and set it? I saw that timestampOffsetInNs is a val, not a var, so I can't assign the value once I have discovered the timestamp.

perotom avatar May 21 '25 10:05 perotom

You can use the getter syntax with a backing field, like this:

private var _timestampOffsetInNs: Long = 0L
override val timestampOffsetInNs: Long
    get() = _timestampOffsetInNs

The addition is made there: https://github.com/ThibaultBee/StreamPack/blob/c0002e97004edd00ac4598ea8590c28449edb266/core/src/main/java/io/github/thibaultbee/streampack/core/elements/processing/video/SurfaceProcessor.kt#L189

It is something like this:

_timestampOffsetInNs = TimeUtils.currentTime() - timestampFirstFrame

ThibaultBee avatar May 21 '25 12:05 ThibaultBee

Neat trick! I am for sure no Kotlin expert. Thank you! I added it but I still get the same issue. I had to replace TimeUtils.currentTime() with System.nanoTime(), so the calculation looks like this now:

override fun onRtspDataVideoNalUnitReceived(
    data: ByteArray,
    offset: Int,
    length: Int,
    timestamp: Long
) {
    if (!timestampFirstFrame) {
        timestampFirstFrame = true
        _timestampOffsetInNs = System.nanoTime() - (timestamp * 1000)
    }
}

Full file:

import android.net.Uri
import android.view.Surface
import com.alexvas.rtsp.codec.VideoDecodeThread.DecoderType
import com.alexvas.rtsp.codec.VideoDecoderSurfaceThread
import com.alexvas.rtsp.widget.RtspDataListener
import com.alexvas.rtsp.widget.RtspProcessor
import io.github.thibaultbee.streampack.core.elements.processing.video.source.DefaultSourceInfoProvider
import io.github.thibaultbee.streampack.core.elements.processing.video.source.ISourceInfoProvider
import io.github.thibaultbee.streampack.core.elements.sources.video.ISurfaceSourceInternal
import io.github.thibaultbee.streampack.core.elements.sources.video.IVideoSourceInternal
import io.github.thibaultbee.streampack.core.elements.sources.video.VideoSourceConfig
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.flow.MutableStateFlow
import kotlinx.coroutines.flow.StateFlow
import kotlinx.coroutines.withContext

/**
 *  Minimal head-less RTSP client that decodes *video-only* into the encoder Surface.
 *
 *  Scope: H.264/H.265 over TCP, no audio, low-latency.
 */
class RtspClientSurfaceSource(
    val rtspUri: Uri,
    private val username: String? = null,
    private val password: String? = null,
    private val userAgent : String? = "StreamPack-RtspClient"
) : IVideoSourceInternal, ISurfaceSourceInternal {

    /* ---------- required StreamPack flows ---------- */
    private val _infoProviderFlow    = MutableStateFlow<ISourceInfoProvider>(DefaultSourceInfoProvider())
    override val infoProviderFlow: StateFlow<ISourceInfoProvider> = _infoProviderFlow

    private val _isStreaming         = MutableStateFlow(false)
    override val isStreamingFlow: StateFlow<Boolean>              = _isStreaming

    private var _timestampOffsetInNs: Long = 0
    override val timestampOffsetInNs: Long
        get() = _timestampOffsetInNs
    private var timestampFirstFrame = false

    /* ---------- rtsp-client objects ---------- */
    private var encoderSurface : Surface?        = null
    private var rtspProcessor  : RtspProcessor?  = null

    /* ---------------------------------------------------------------------- */
    override suspend fun configure(config: VideoSourceConfig) {
        /* no-op – we create the processor lazily in startStream() once the
           encoder Surface has been provided via setOutput().                */
    }

    override suspend fun setOutput(surface: Surface) { encoderSurface = surface }

    override suspend fun getOutput(): Surface?     = encoderSurface
    override suspend fun resetOutput()             { encoderSurface = null }

    override suspend fun startStream() {
        check(! _isStreaming.value)          { "already started" }
        val surface = encoderSurface ?: error("setOutput() must be called first")

        // All rtsp-client operations must run on the main thread
        withContext(Dispatchers.Main.immediate) {
            rtspProcessor = RtspProcessor(
                onVideoDecoderCreateRequested = { mime, rot, queue, listener, _ ->
                    VideoDecoderSurfaceThread(
                        surface, mime,
                        /* dummy size – codec derives from SPS */ 1920, 1080,
                        rot, queue, listener, DecoderType.HARDWARE
                    )
                }
            ).apply {
                // use the data listener to get the timestamp of the first frame
                dataListener = object : RtspDataListener {
                    override fun onRtspDataVideoNalUnitReceived(
                        data: ByteArray,
                        offset: Int,
                        length: Int,
                        timestamp: Long
                    ) {
                        if (!timestampFirstFrame) {
                            timestampFirstFrame = true
                            _timestampOffsetInNs = System.nanoTime() - (timestamp * 1000)
                        }
                    }

                    /* we ignore audio & application streams */
                    override fun onRtspDataAudioSampleReceived(data: ByteArray, offset: Int, length: Int, timestamp: Long) = Unit
                    override fun onRtspDataApplicationDataReceived(data: ByteArray, offset: Int, length: Int, timestamp: Long) = Unit
                }
                // absolutely *video only* for this source
                init(rtspUri, username, password, userAgent)
                start(requestVideo = true, requestAudio = false)
            }
        }
        _isStreaming.value = true
    }

    override suspend fun stopStream() {
        rtspProcessor?.stop()
        _isStreaming.value = false
    }

    override fun release() {
        rtspProcessor?.stop()
        rtspProcessor = null
        encoderSurface = null
    }
}

perotom avatar May 21 '25 12:05 perotom

You are right, currentTime is in µs, not ns. Great to hear you succeeded :D

ThibaultBee avatar May 21 '25 12:05 ThibaultBee

Thanks for guidance! Sadly it still gives me the same error. Any ideas what to do or where to start?

perotom avatar May 21 '25 13:05 perotom

Log the packet timestamp in the SrtSink and compare it with Time.now() from srtdroid.

ThibaultBee avatar May 21 '25 13:05 ThibaultBee

I see, so I will need to fork this repo and install it locally. This will take some time. I guess I will need to insert it here: https://github.com/ThibaultBee/StreamPack/blob/c0002e97004edd00ac4598ea8590c28449edb266/extensions/srt/src/main/java/io/github/thibaultbee/streampack/ext/srt/internal/endpoints/composites/sinks/SrtSink.kt#L131

something like Log.d("SrtSink", "Now: ${Instant.now().toEpochMilli() * 1000}, Timestamp: ${packet.ts}")

perotom avatar May 21 '25 14:05 perotom

Log.d("SrtSink", "Now: ${Time.now()}, Timestamp: ${packet.ts}") instead.

ThibaultBee avatar May 21 '25 15:05 ThibaultBee

Are you sure the timestamp is in µs?

ThibaultBee avatar May 21 '25 15:05 ThibaultBee

You mean the one coming from onRtspDataVideoNalUnitReceived? As far as I understood, it comes from the RTP header (camera). I am not an expert, but ChatGPT gives me this:

RTP Timestamp Unit:
• No fixed unit like seconds or milliseconds.
• The "unit" is one tick of the media's clock, which varies depending on the codec.

So timestamp values are just counters. For example, if a video frame is captured every 1/30th of a second (30 fps), the timestamp for each frame increments by 90000 / 30 = 3000 units per frame.

How Do You Know the Clock Rate?
• It is defined in the RTP payload format specification.
• It is often signaled via SDP (Session Description Protocol) in RTSP setups. For example:

m=video 5004 RTP/AVP 96
a=rtpmap:96 H264/90000

Here, H264/90000 means the timestamp clock rate is 90,000 Hz.
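
For reference, the tick-to-time conversion itself is simple; an illustrative sketch assuming the 90 kHz video clock from the SDP example above:

// converts raw RTP ticks to microseconds for a given RTP clock rate
fun rtpTicksToUs(ticks: Long, clockRateHz: Long = 90_000L): Long =
    ticks * 1_000_000L / clockRateHz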

Seems like I need to do RTP stream inspection to figure out the clock. This makes it quite complicated I guess.

perotom avatar May 21 '25 15:05 perotom

As I need to fork this repo anyways, I think I will just go the simple route and set every packet to Time.now(). Do you see any disadvantages with that?

perotom avatar May 22 '25 11:05 perotom

As you don't need audio, I don't.

It would be nice to understand the underlying reason. Does rtsp-client-android properly write the timestamp to the surface?

ThibaultBee avatar May 22 '25 11:05 ThibaultBee

I think I want to add audio later on, so let's try to find a proper solution then.

I took the rtsp-client-android sample app and figured out that the timestamp delivered by onRtspDataVideoNalUnitReceived is indeed in µs. It represents the µs elapsed since the connection was opened (so on reconnect it starts at 0 again).

Log output:

Log.i("RtspClientSurfaceSource","First timestamp at $timestamp, diff is ${_timestampOffsetInNs}ns")
                        
2025-05-22 14:32:56.424 20252-20333 RtspClientSurfaceSource First timestamp at 1613599, diff is 1747917174810401000ns

Does this mean _timestampOffsetInNs = System.nanoTime() - (timestamp * 1000) should work (I tried, it doesn't)? _timestampOffsetInNs now contains the Unix timestamp in nanoseconds of the first frame.
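
One implication of the restart-at-zero behaviour above: the offset would have to be re-captured on every reconnect. A hypothetical sketch, reusing the fields of the class above (the hook name is an assumption; whichever reconnect callback the RTSP client exposes would do):

// re-arm the first-frame latch so the next NAL unit recomputes _timestampOffsetInNs
fun onRtspReconnected() {
    timestampFirstFrame = false
}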

perotom avatar May 22 '25 12:05 perotom

If 1613599 is in µs and is the first timestamp since the connection, that means it takes more than 1 s to send a frame. Do you have the connection timestamp? Can you access the timestamp of the surface?

If you fork StreamPack, you can also log timestamp and offset there: https://github.com/ThibaultBee/StreamPack/blob/c0002e97004edd00ac4598ea8590c28449edb266/core/src/main/java/io/github/thibaultbee/streampack/core/elements/processing/video/SurfaceProcessor.kt#L189
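
A hypothetical logging sketch at that spot (the local names around SurfaceProcessor.kt#L189 are assumptions):

// log both values so the relation between the surface timestamp and the offset is visible
Log.d("SurfaceProcessor", "timestampNs=$timestamp offsetNs=$timestampOffsetInNs sumNs=${timestamp + timestampOffsetInNs}")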

ThibaultBee avatar May 22 '25 12:05 ThibaultBee

Hmm, isn't the offset = 0 when createInputSurface is called?

Can you replace with:

private var _timestampOffsetInNs: Long? = null
override val timestampOffsetInNs: Long
    get() = _timestampOffsetInNs ?: throw IllegalStateException("Timestamp offset not set")

ThibaultBee avatar May 22 '25 12:05 ThibaultBee

If 1613599 is in µs and is the first timestamp since the connection, that means it takes more than 1 s to send a frame. Do you have the connection timestamp? Can you access the timestamp of the surface?

If you fork StreamPack, you can also log timestamp and offset there: https://github.com/ThibaultBee/StreamPack/blob/c0002e97004edd00ac4598ea8590c28449edb266/core/src/main/java/io/github/thibaultbee/streampack/core/elements/processing/video/SurfaceProcessor.kt#L189

  1. Yes, it looks like it takes a little more than 1 s.
  2. I will need to take a look, but I guess the problem lies down below:

Hmm, isn't the offset = 0 when createInputSurface is called?

Can you replace with:

private var _timestampOffsetInNs: Long? = null
override val timestampOffsetInNs: Long
    get() = _timestampOffsetInNs ?: throw IllegalStateException("Timestamp offset not set")

I did, and the exception happens before the timestamp is set, so I guess it is set too late. Does this mean the concept is not working? Or is the issue that the RTSP stream seems to be consumed only after SRT was started?

perotom avatar May 22 '25 13:05 perotom

This is something I haven't thought about. I need a few days to find a fix.

As a workaround, could you try to hardcode timestampOffsetInNs:

 override val timestampOffsetInNs = System.nanoTime() - 1_610_000

ThibaultBee avatar May 22 '25 13:05 ThibaultBee

Great idea, now the error is gone and I see the video stream on the app.

I now see it is streaming to our servers (at 4 Mbit/s), but I can't open the stream with VLC. I also see the following errors popping up in the console. It is a little strange that I can see the video in the app but the stream doesn't open, right?

2025-05-22 15:51:49.062 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:51:49.061691/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E129 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:51:49.876 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:51:49.876416/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E465 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:51:49.887 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:51:49.887811/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E465 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:51:49.894 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:51:49.894975/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E465 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:51:52.930 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:51:52.930180/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E1661 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:51:53.133 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:51:53.133929/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E1802 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:51:56.841 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:51:56.841964/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E2974 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:51:56.858 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:51:56.858022/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E2974 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:52:05.940 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:52:05.940467/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E6636 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:52:44.851 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:52:44.850912/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E21906 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:53:00.424 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:53:00.424090/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E27875 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:53:02.320 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:53:02.319874/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E28528 which is NOT SUBSCRIBED to @711035142
2025-05-22 15:53:02.320 21162-21214 libsrt E  /home/runner/work/srtdroid/srtdroid/srtdroid-core/.cxx/RelWithDebInfo/6h4y1g40/arm64-v8a/srt_project-prefix/src/srt_project/srtcore/epoll.cpp@900:update_events 15:53:02.320838/SRT:RcvQ:w1*E:SRT.ei: epoll/update: IPE: update struck E28528 which is NOT SUBSCRIBED to @711035142

My camera settings:

[screenshot of camera settings]

perotom avatar May 22 '25 14:05 perotom

It could be that frames arrive too late and libsrt drops them. Look at the SRT stats; increasing the SRT latency might be a workaround here.
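
For example, a hedged sketch of raising the latency through the connection URL, assuming the SrtUrl accepts the standard libsrt latency query option (value in milliseconds):

// hypothetical URL; host, port and the 2000 ms latency are placeholders to adapt
val srtUrl = "srt://my.server.example:9998?latency=2000"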

I have the feeling that you require specific support. Could you contact me on LinkedIn?

ThibaultBee avatar May 22 '25 14:05 ThibaultBee

Yes good idea, I did.

perotom avatar May 23 '25 05:05 perotom