
How can I stream images obtained from TextureView

Open FawadAbbas12 opened this issue 1 year ago • 14 comments

Hello, thanks for open-sourcing your implementation. I know this might be a quite basic question, but I don't have much experience with Android. My scenario is that I have developed an app that receives DJI drone video on my phone, and I need to stream that video to my PC for further processing. DJI also provides an RTMP streaming option, but it lags a lot. I have tested your app and it gives reasonable results for streaming videos. I can access the bitmap shown in the TextureView of my current app, but I am not sure how I can stream those frames using your RTMP API. If possible, can you give me an example or point me to a resource explaining the streaming process (for example, how it reads video from a file)? Then I can change that implementation to read bitmaps from my TextureView instead. Thanks

FawadAbbas12 avatar Nov 13 '23 09:11 FawadAbbas12

Is this the function in which I have to modify the code to send my own image?

private void decode() {
  // ...
  while (running) {
    synchronized (sync) {
      // ...
      int inIndex = codec.dequeueInputBuffer(10000);
      int sampleSize = 0;
      if (inIndex >= 0) {
        ByteBuffer input;
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
          input = codec.getInputBuffer(inIndex);
        } else {
          input = codec.getInputBuffers()[inIndex];
        }
        // ...
        if (sampleSize < 0) {
          if (!loopMode) {
            codec.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
          }
        } else {
          codec.queueInputBuffer(inIndex, 0, sampleSize, ts + sleepTime, 0);
          extractor.advance();
        }
      }
      int outIndex = codec.dequeueOutputBuffer(bufferInfo, 10000);
      if (outIndex >= 0) {
        if (!sleep(sleepTime)) return;
        ByteBuffer output;
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
          output = codec.getOutputBuffer(outIndex);
        } else {
          output = codec.getOutputBuffers()[outIndex];
        }
        boolean render = decodeOutput(output);
        codec.releaseOutputBuffer(outIndex, render && bufferInfo.size != 0);
        boolean finished = extractor.getSampleTime() < 0;
        // ...
      }
    }
  }
}

FawadAbbas12 avatar Nov 13 '23 10:11 FawadAbbas12

Hello,

Since you already have H264/H265 buffers using DJI, you can use the rtmp module directly. You can follow the code of another user and adapt it; there is an issue where this seems to work using H264: https://github.com/pedroSG94/RootEncoder/issues/1311 You may need to replace decodeSpsPpsFromByteArray because that method is for H265. You can try this one instead: https://github.com/pedroSG94/RootEncoder/blob/master/encoder/src/main/java/com/pedro/encoder/video/VideoEncoder.java#L362
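For reference, extracting SPS and PPS from an Annex-B H264 buffer amounts to scanning for start codes and reading the NAL type (7 = SPS, 8 = PPS). A minimal sketch along those lines (the helper name is mine, it assumes 4-byte 00 00 00 01 start codes, and depending on the library version setVideoInfo may want the buffers with or without that prefix, so compare with what decodeSpsPpsFromBuffer returns):

// Sketch only, not library code. Uses android.util.Pair, java.nio.ByteBuffer, java.util.Arrays.
// Assumes 4-byte 00 00 00 01 start codes in the incoming Annex-B H264 buffer.
private Pair<ByteBuffer, ByteBuffer> extractSpsPps(byte[] data, int size) {
  ByteBuffer sps = null;
  ByteBuffer pps = null;
  int start = -1; // index of the first byte after the current start code
  for (int i = 0; i + 4 <= size; i++) {
    boolean startCode = data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0 && data[i + 3] == 1;
    if (!startCode) continue;
    if (start >= 0) {
      int nalType = data[start] & 0x1F;
      if (nalType == 7 && sps == null) sps = ByteBuffer.wrap(Arrays.copyOfRange(data, start, i));
      if (nalType == 8 && pps == null) pps = ByteBuffer.wrap(Arrays.copyOfRange(data, start, i));
    }
    start = i + 4;
    i += 3; // jump past the start code
  }
  // flush the last NAL in the buffer
  if (start >= 0 && start < size) {
    int nalType = data[start] & 0x1F;
    if (nalType == 7 && sps == null) sps = ByteBuffer.wrap(Arrays.copyOfRange(data, start, size));
    if (nalType == 8 && pps == null) pps = ByteBuffer.wrap(Arrays.copyOfRange(data, start, size));
  }
  return (sps != null && pps != null) ? new Pair<>(sps, pps) : null;
}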

pedroSG94 avatar Nov 13 '23 10:11 pedroSG94

Thanks i will try this but actually i only have bitmap obtained from texture view using following code

Bitmap bitmap = mVideoSurface.getBitmap();

FawadAbbas12 avatar Nov 13 '23 10:11 FawadAbbas12

Hello,

Using bitmaps is not a good solution, because that way:

  • First you need to convert to H264 or H265 (which is not easy)
  • The conversion is slow (low fps)
  • The way you get the bitmap results in low fps

I recommend you read the DJI documentation because I'm sure you can get H264/H265 buffers. I think this is the documentation you need: https://developer.dji.com/api-reference-v5/android-api/Components/SDKManager/DJISDKManager.html

Try to find a way to get raw H264 buffers. Maybe this could help you: https://developer.dji.com/doc/mobile-sdk-tutorial/en/tutorials/video-stream.html
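For context, the DJI tutorial above registers a VideoFeeder.VideoDataListener to receive the raw H264 feed. A rough sketch of that registration (the exact feed accessor can differ by SDK version, so verify the method names against the DJI docs):

// Sketch: register a listener for the raw (H264) primary video feed.
// Verify getPrimaryVideoFeed()/addVideoDataListener() against your DJI SDK version.
VideoFeeder.VideoDataListener listener = new VideoFeeder.VideoDataListener() {
  @Override
  public void onReceive(byte[] videoBuffer, int size) {
    // videoBuffer holds Annex-B H264 NAL units; forward them to the rtmp module here
  }
};
VideoFeeder.getInstance().getPrimaryVideoFeed().addVideoDataListener(listener);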

pedroSG94 avatar Nov 13 '23 12:11 pedroSG94

Yes, there is a VideoFeeder.VideoDataListener() class to receive raw H264 data, but apparently it is not working for the Mavic 2 Pro :( That is why I thought of using bitmaps, but they are really slow to encode. I will try to get the raw H264 frames.

FawadAbbas12 avatar Nov 13 '23 13:11 FawadAbbas12

Hello @pedroSG94, I have managed to get the raw buffer and I am using the rtmp client as you said, but I am facing a slight issue: the connection opens correctly, but no data is getting transmitted. Can you have a look at my code and tell me what I am doing wrong? I have also validated that the DJI callback function is getting executed.

void onCreate() {
    // callbacks for rtmp client
    ConnectCheckerRtmp connectCheckerRtmp = new ConnectCheckerRtmp() {
        // ...
    };

    rtmpClient = new RtmpClient(connectCheckerRtmp);
    rtmpClient.setVideoResolution(640, 640);
    rtmpClient.setFps(30);
    rtmpClient.connect("rtmp://192.168.137.59/demo/d2");

    // This callback returns a raw H264 encoded buffer
    mReceivedVideoDataListener = new VideoFeeder.VideoDataListener() {
        @Override
        public void onReceive(byte[] videoBuffer /* H264 encoded buffer */, int size) {
            DJIVideoStreamDecoder.getInstance().parse(videoBuffer, size);
            // here I am sending it via the rtmp client
            rtmpClient.sendVideo(ByteBuffer.wrap(videoBuffer), new MediaCodec.BufferInfo());
        }
    };
}

FawadAbbas12 avatar Nov 16 '23 03:11 FawadAbbas12

Hello,

That code is incomplete. You can check this example: https://github.com/pedroSG94/RootEncoder/issues/1033#issuecomment-1008749330 If you don't have an isIdr boolean you can check it manually like this: https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/base/recording/BaseRecordController.java#L60 This is the decodeSpsPpsFromBuffer method: https://github.com/pedroSG94/RootEncoder/blob/fdd495e6bfd114d54d35e57c888c5737b20f9dd2/encoder/src/main/java/com/pedro/encoder/video/VideoEncoder.java#L362

Also, remember to use setOnlyVideo(true) (at the same moment you use setVideoResolution) if you are not using audio.
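Putting those hints together, the setup phase from your onCreate snippet would look roughly like this (a sketch built only from the calls already used in this thread; the URL and resolution are just the values from your example):

rtmpClient = new RtmpClient(connectCheckerRtmp);
rtmpClient.setOnlyVideo(true);              // no audio track will be sent
rtmpClient.setVideoResolution(640, 640);    // must match the resolution of the H264 stream
rtmpClient.setFps(30);
rtmpClient.connect("rtmp://192.168.137.59/demo/d2");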

pedroSG94 avatar Nov 16 '23 07:11 pedroSG94

Thanks, I will add it :)

FawadAbbas12 avatar Nov 16 '23 08:11 FawadAbbas12

/** Step 1: create the connection with the rtmp server (nginx). The connection code is in onCreate. */
rtmpClient.connect("rtmp://192.168.137.59");

/** Step 2: decode sps/pps from the video buffer, set vps to null, then set the video info. */
Pair<ByteBuffer, ByteBuffer> buffers = decodeSpsPpsFromBuffer(ByteBuffer.wrap(videoBuffer), size);
if (buffers != null) {
    Log.i(_TAG, "manual sps/pps extraction success");
    sps = buffers.first;
    pps = buffers.second;
    vps = null;
    spsPpsSetted = true;
} else {
    spsPpsSetted = false;
}

try {
    rtmpClient.setVideoInfo(sps, pps, vps);
    Log.i(_TAG, "set video info");
} catch (Exception ee) {
    Log.e(_TAG, "setVideoInfo failed", ee);
}

/** Step 3: send the video buffer, creating a MediaCodec.BufferInfo object for it. */
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
try {
    rtmpClient.sendVideo(ByteBuffer.wrap(videoBuffer), info);
    long result = rtmpClient.getSentVideoFrames();
    Log.d(_TAG, "" + result);
} catch (Exception ee) {
    Log.e(_TAG, "sendVideo failed", ee);
}

I have the same issue. I was able to get the video buffer and decode the sps/pps values from it, but the main problem is that rtmpClient.getSentVideoFrames() always returns 0.

ghost avatar Nov 28 '23 05:11 ghost

Hello,

You are not creating the BufferInfo correctly. You need to update its values properly. Check it in my example: https://github.com/pedroSG94/RootEncoder/issues/1033#issuecomment-1008749330
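For illustration, a filled-in BufferInfo for each received buffer could look like the sketch below. The fields are the standard android.media.MediaCodec.BufferInfo ones; the startTsUs variable and the isKeyFrame() check (like the one discussed below) are assumptions based on the linked example, not verbatim library code:

// Sketch: populate BufferInfo before calling sendVideo.
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
info.offset = 0;
info.size = size;                                   // byte count actually received in onReceive
long nowUs = System.nanoTime() / 1000;
info.presentationTimeUs = nowUs - startTsUs;        // startTsUs: captured once at connect time (assumed field)
info.flags = isKeyFrame(ByteBuffer.wrap(videoBuffer))
    ? MediaCodec.BUFFER_FLAG_KEY_FRAME
    : 0;
rtmpClient.sendVideo(ByteBuffer.wrap(videoBuffer), info);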

pedroSG94 avatar Nov 28 '23 08:11 pedroSG94

Yes, I saw that comment and I also tried it, but I was unable to get the keyframe; the NAL types I am getting are 7 and then 1.

protected boolean isKeyFrame(ByteBuffer videoBuffer) {
    // Assumes a 4-byte Annex-B start code (00 00 00 01), so header[4] is the NAL header byte.
    byte[] header = new byte[5];
    videoBuffer.duplicate().get(header, 0, header.length);
    if (videoMime.equals(CodecUtil.H264_MIME)) { // h264: IDR NAL
        return (header[4] & 0x1F) == RtpConstants.IDR;
    } else { // h265: IDR_W_DLP or IDR_N_LP NAL
        int nalType = (header[4] >> 1) & 0x3f;
        return videoMime.equals(CodecUtil.H265_MIME)
                && (nalType == RtpConstants.IDR_W_DLP || nalType == RtpConstants.IDR_N_LP);
    }
}

In my case this is always false.

ghost avatar Nov 28 '23 09:11 ghost

I found a similar problem in #817, but no solution was given there either. Is there any other way to find the keyframe?

ghost avatar Nov 28 '23 09:11 ghost

Hello,

If you are getting only NAL types 1 and 7, you will need to decode the frames and re-encode them to fix the problem, as you mentioned here: https://github.com/pedroSG94/RootEncoder/issues/817#issuecomment-1829543268

The code posted there is the way to do it. Let me explain that code.

It decodes the frames provided by DJI in the onReceive callback, using DJICodecManager as the decoder. DJICodecManager is then connected to my VideoEncoder class by rendering into the surface provided by glInterface, which contains my VideoEncoder surface, so the frames are automatically re-encoded. This way you can get frames from VideoEncoder through the getVideoData interface.
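In outline, that decode-and-re-encode pipeline looks something like the sketch below. The constructor and method signatures of DJICodecManager, VideoEncoder, prepareVideoEncoder and GetVideoData vary between SDK and library versions, so treat every call here as an assumption to check against the linked example rather than working code:

// Sketch of the decode -> re-encode flow (signatures are assumptions, see the linked example).

// 1. Prepare RootEncoder's VideoEncoder in SURFACE mode. Its GetVideoData callback
//    (exact method names differ between library versions) hands back sps/pps for
//    rtmpClient.setVideoInfo(...) and encoded buffers + BufferInfo for rtmpClient.sendVideo(...).
VideoEncoder videoEncoder = new VideoEncoder(getVideoDataCallback); // getVideoDataCallback: your GetVideoData impl
videoEncoder.prepareVideoEncoder(1280, 720, 30, 2_000_000, 0, 2, FormatVideoEncoder.SURFACE); // overload may differ
videoEncoder.start();

// 2. Let DJICodecManager decode the raw DJI feed directly onto the encoder's input surface.
DJICodecManager codecManager =
    new DJICodecManager(context, videoEncoder.getInputSurface(), 1280, 720); // Surface constructor assumed; context = your Activity

// 3. In VideoFeeder's onReceive callback, push the raw buffers into the decoder:
//    codecManager.sendDataToDecoder(videoBuffer, size);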

pedroSG94 avatar Nov 28 '23 11:11 pedroSG94

Hello,

Remember to change the resolution in the DJIExample class (width and height). Also check rtmpClient.setVideoResolution, because 1080x720 is a weird resolution; maybe you want to use 1280x720.

pedroSG94 avatar Nov 30 '23 08:11 pedroSG94