
How to process frames from the USB camera

Open publioelon opened this issue 3 years ago • 6 comments

Hi, I am using a thermal camera, and by default the raw image needs to be converted to greyscale. I managed to get the library working, but I need to use OpenCV to process the frames before they are streamed. I'd like to know how I can get each frame before it is streamed, so I can pass it to a function.

publioelon avatar Apr 12 '22 02:04 publioelon

You will need to get the frames from the uvccamera library. That library produces MJPEG or YUV frames in C++ code and draws them to a SurfaceTexture.

pedroSG94 avatar Apr 12 '22 08:04 pedroSG94

Hi @pedroSG94, the function that gives me a frame from the UVCCamera, as per the documentation, is this one:

    private byte[] FrameData = new byte[384 * 292 * 4];
    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @Override
        public void onFrame(final ByteBuffer frameData) {
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
        }
    };
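
For completeness, a one-line sketch of how a callback like this is hooked up in the UVCCamera library (PIXEL_FORMAT_YUV420SP is one of the library's pixel-format constants; the right constant depends on what the camera actually delivers):

    mUVCCamera.setFrameCallback(mIFrameCallback, UVCCamera.PIXEL_FORMAT_YUV420SP);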

It does offer either MJPEG or YUV as you said, but how do I draw these frames to a SurfaceTexture so they can be streamed?

puelon avatar Apr 21 '22 05:04 puelon

You have two ways:

  • Render to an OpenGL SurfaceTexture, doing the YUV-to-RGB conversion in a fragment shader (fragment.glsl).
  • Encode the YUV to H264 directly and, if a preview is needed, set an output Surface for your preview on the encoder.

You have other ways with lower performance, like using a canvas to render to a Surface, or converting to a bitmap and drawing it. This is all assuming that you receive YUV. If you can only receive MJPEG, I think the best way is to decode it and encode it again.
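
For illustration, a rough sketch of the second option: feeding raw YUV bytes straight into a MediaCodec H264 encoder. The 384x292 size and 26 fps come from this thread; the color format, bitrate, and the yuvBytes buffer are assumptions that depend on the device:

    import java.nio.ByteBuffer;
    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;

    // One-time setup of the H264 encoder.
    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 384, 292);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 1200 * 1024);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 26);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);
    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    encoder.start();

    // Per frame (e.g. inside the UVC frame callback): copy the YUV bytes in.
    int inIndex = encoder.dequeueInputBuffer(10_000);
    if (inIndex >= 0) {
        ByteBuffer input = encoder.getInputBuffer(inIndex); // API 21+
        input.clear();
        input.put(yuvBytes); // yuvBytes: one NV21/I420 frame from the camera
        encoder.queueInputBuffer(inIndex, 0, yuvBytes.length,
                System.nanoTime() / 1000, 0);
    }

The encoded output buffers would then be drained with dequeueOutputBuffer and handed to the RTMP/RTSP sender.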

pedroSG94 avatar Apr 21 '22 09:04 pedroSG94

Hey @pedroSG94, thanks for the reply. My problem is that even though it's YUV, I would still need to add false colors to the image, so I need a way to process it before the preview. I was wondering if it's possible to work with a TextureView? I think it would be easier and faster, since my frames are already processed and displayed using a TextureView.
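
As a side note, the false-color step itself is short with OpenCV's Java bindings. A hedged sketch, assuming OpenCV is already initialised and greyBytes holds the luminance plane of one 384x292 frame:

    import org.opencv.core.CvType;
    import org.opencv.core.Mat;
    import org.opencv.imgproc.Imgproc;

    Mat grey = new Mat(292, 384, CvType.CV_8UC1);  // rows = height, cols = width
    grey.put(0, 0, greyBytes);                     // wrap the raw grey frame
    Mat colored = new Mat();
    Imgproc.applyColorMap(grey, colored, Imgproc.COLORMAP_JET); // false color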

Above the function replaceGlInterface there's a comment:

    /**
     * Replace glInterface used on fly. Ignored if you use SurfaceView or TextureView
     */

I am not sure what it means, but is there any way I could stream the images from a TextureView instead of an OpenGlView? If not, is there any way to have a TextureView copy over to the OpenGlView for streaming?

puelon avatar Apr 27 '22 06:04 puelon

replaceGlInterface is only available if you are using OpenGlView, OffScreenGlThread or LightOpenGlView, because this method basically replaces the OpenGL renderer on the fly.

I suggest you draw the YUV bytes in the TextureView and send those bytes to the VideoEncoder like I'm doing here: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/encoder/src/main/java/com/pedro/encoder/input/video/Camera1ApiManager.java#L398

You can do exactly the same as Camera1 does, but with a TextureView: get the YUV data from the camera and send it to the VideoEncoder.
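
For illustration, a minimal sketch of that bridge inside the UVC frame callback, following the pattern at the linked line (Frame and inputYUVData are taken from that file and may differ between versions; ImageFormat.NV21 is an assumption about the camera's output format):

    import java.nio.ByteBuffer;
    import android.graphics.ImageFormat;
    // com.pedro.encoder.Frame is assumed, as used in Camera1ApiManager.

    private final IFrameCallback mIFrameCallback = new IFrameCallback() {
        @Override
        public void onFrame(final ByteBuffer frameData) {
            frameData.clear();
            frameData.get(FrameData, 0, frameData.capacity());
            // Process FrameData here (false color, etc.), then hand it to the encoder.
            videoEncoder.inputYUVData(new Frame(FrameData, 0, false, ImageFormat.NV21));
        }
    };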

pedroSG94 avatar Apr 27 '22 08:04 pedroSG94

Thanks for the guidance. I managed to set up my camera so it displays MJPEG on a TextureView, so I can successfully process the images to greyscale or other colormaps as I mentioned before; now I just have to figure out a way to stream it. It may be best if I keep YUV as a second option, as it may add more latency depending on what I can do with it.

What I did was add the rtplibrary to my project and set it to 1.5.0 as you mentioned in another issue, and I created two classes, USBBase.java and RtmpUSB.java, based on this project. Streaming a green image worked for me, but I ran into the same greyscale-processing problem; the only difference is that they're using an OpenGlView. Below is my current code. With MJPEG it works perfectly for displaying and processing the images, but it still needs streaming:

public class MainActivity extends Activity {
    private static final String TAG = "MainActivity";
    private static final int PREVIEW_WIDTH = 384;
    private static final int PREVIEW_HEIGHT = 292;
    private SimpleUVCCameraTextureView mUVCCameraView;
    private UVCCamera mUVCCamera;
    private Surface mPreviewSurface;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main2);
        Log.e(TAG, "onCreate:");
        mUVCCameraView = (SimpleUVCCameraTextureView) findViewById(R.id.camera_view123);
        mUVCCameraView.setAspectRatio(PREVIEW_WIDTH / (float) PREVIEW_HEIGHT);
    }

    // USBMonitor.OnDeviceConnectListener callback (listener wiring omitted here);
    // handleOpen and UVCsetValue are helpers elsewhere in my code.
    public void onConnect(UsbDevice device, USBMonitor.UsbControlBlock ctrlBlock, boolean createNew) {
        Log.e(TAG, "onConnect:");
        handleOpen(ctrlBlock);
        mUVCCamera = new UVCCamera(0);
        mUVCCamera.open(ctrlBlock);
        startPreview();
        // Delay the zoom command slightly so the camera has time to settle.
        new Handler().postDelayed(new Runnable() {
            @Override
            public void run() {
                UVCsetValue(UVCCamera.CTRL_ZOOM_ABS, 0x8004);
            }
        }, 300);
    }

    private void startPreview() {
        SurfaceTexture st = mUVCCameraView.getSurfaceTexture();
        mPreviewSurface = new Surface(st);
        handleStartPreview(mPreviewSurface);
    }

    public void handleStartPreview(Object surface) {
        if (mUVCCamera == null) return;
        try {
            // Try MJPEG first...
            mUVCCamera.setPreviewSize(PREVIEW_WIDTH, PREVIEW_HEIGHT, 1, 26,
                    UVCCamera.FRAME_FORMAT_MJPEG, UVCCamera.DEFAULT_BANDWIDTH, 0);
        } catch (IllegalArgumentException e) {
            try {
                // ...and fall back to YUV mode if MJPEG is not supported.
                mUVCCamera.setPreviewSize(PREVIEW_WIDTH, PREVIEW_HEIGHT, 1, 26,
                        UVCCamera.DEFAULT_PREVIEW_MODE, UVCCamera.DEFAULT_BANDWIDTH, 0);
            } catch (IllegalArgumentException e1) {
                callOnError(e1);
                return;
            }
        }
        if (surface instanceof SurfaceHolder) {
            mUVCCamera.setPreviewDisplay((SurfaceHolder) surface);
        } else if (surface instanceof Surface) {
            mUVCCamera.setPreviewDisplay((Surface) surface);
        } else if (surface instanceof SurfaceTexture) {
            mUVCCamera.setPreviewTexture((SurfaceTexture) surface);
        }
        mUVCCamera.startPreview();
    }
}

I think I've almost got it. How can I make the RtmpUSB and USBBase classes work with a TextureView surface, or do I necessarily need an OpenGlView?

puelon avatar Apr 27 '22 20:04 puelon