webrtc-android
YUV/NV21 drawing
I have a rendering issue that is in fact isolated from the WebRTC protocol, but related to this lib... I want to draw raw YUV frames: live Full HD 30 fps video obtained from a 3rd-party source.
I've placed a SurfaceViewRenderer in my Fragment, and in onCreateView I do:
mSurfaceViewRenderer = rootView.findViewById(R.id.surfaceViewRenderer);
mSurfaceViewRenderer.init(EglBase.create().getEglBaseContext(), null);
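For completeness, the fuller init/teardown shape would be roughly like this (just a sketch; the EglBase field and the layout name are placeholders, the point being to keep the EglBase reference around so it can be released together with the renderer):
private EglBase mEglBase;

@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
    View rootView = inflater.inflate(R.layout.fragment_preview, container, false); // placeholder layout name
    mSurfaceViewRenderer = rootView.findViewById(R.id.surfaceViewRenderer);
    mEglBase = EglBase.create(); // keep the reference, not just the context
    mSurfaceViewRenderer.init(mEglBase.getEglBaseContext(), null);
    return rootView;
}

@Override
public void onDestroyView() {
    mSurfaceViewRenderer.release(); // free GL resources together with the view
    mEglBase.release();
    super.onDestroyView();
}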
I'm trying to feed this surface with prepared frames (the callback is invoked on its own thread, not the main one), like below:
@Override
public void onYuvDataByteBuffer(MediaFormat mediaFormat, ByteBuffer data, int dataSize, int width, int height) {
    int rowStrideY = width;
    int rowStrideU = width / 2;
    int rowStrideV = width / 2;

    // I420 layout: Y takes 4/6 of the buffer, U and V take 1/6 each
    int basicOffset = data.remaining() / 6;
    int offsetY = 0;
    int offsetU = basicOffset * 4;
    int offsetV = basicOffset * 5;

    ByteBuffer i420ByteBuffer = data;
    i420ByteBuffer.position(offsetY);
    final ByteBuffer dataY = i420ByteBuffer.slice();
    i420ByteBuffer.position(offsetU);
    final ByteBuffer dataU = i420ByteBuffer.slice();
    i420ByteBuffer.position(offsetV);
    final ByteBuffer dataV = i420ByteBuffer.slice();

    JavaI420Buffer javaI420Buffer = JavaI420Buffer.wrap(width, height,
            dataY, rowStrideY,
            dataU, rowStrideU,
            dataV, rowStrideV,
            null
            /*() -> {
                JniCommon.nativeFreeByteBuffer(i420ByteBuffer);
            }*/);

    VideoFrame frame = new VideoFrame(javaI420Buffer, 0, System.currentTimeMillis());
    mSurfaceViewRenderer.onFrame(frame);
}
Yes, there are some ugly hardcoded values for now, but it works... kind of. The first frame is rendered properly, no issues, but the second call causes a throw:
FATAL EXCEPTION: SurfaceViewRendererEglRenderer
Process: thats.my.package, PID: 12970
java.lang.IllegalStateException: buffer is inaccessible
at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:159)
at org.webrtc.JavaI420Buffer.getDataY(JavaI420Buffer.java:118)
at org.webrtc.VideoFrameDrawer$YuvUploader.uploadFromBuffer(VideoFrameDrawer.java:114)
at org.webrtc.VideoFrameDrawer.drawFrame(VideoFrameDrawer.java:221)
at org.webrtc.EglRenderer.renderFrameOnRenderThread(EglRenderer.java:664)
at org.webrtc.EglRenderer.lambda$im8Sa54i366ODPy-soB9Bg4O-w4(Unknown Source:0)
at org.webrtc.-$$Lambda$EglRenderer$im8Sa54i366ODPy-soB9Bg4O-w4.run(Unknown Source:2)
at android.os.Handler.handleCallback(Handler.java:883)
at android.os.Handler.dispatchMessage(Handler.java:100)
at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:103)
at android.os.Looper.loop(Looper.java:214)
at android.os.HandlerThread.run(HandlerThread.java:67)
That happens with your dependency as well as with the current official release (1.0.+), but I'm just glad I've found a place where I can post my problem, and I believe someone knowledgeable will read this here :)
So: is there a way to use the WebRTC lib just for drawing YUV? It surely contains all the needed building blocks. Currently it looks to me like the rendering part of the code is tied to the protocol and to the API's methods/ways of obtaining video (from a stream or a file, not from a custom callback). Maybe there is a sample somewhere showing the built-in camera preview rendered "manually", not via setPreviewSurface?
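What I would hope for is that nothing more than the renderer itself is needed, i.e. something along these lines (just a sketch, untested, no PeerConnection involved, feeding a synthetic gray frame to show the renderer being driven manually; uses org.webrtc.JavaI420Buffer, org.webrtc.VideoFrame and java.nio.ByteBuffer):
// assumes mSurfaceViewRenderer.init(...) was already called as above
private void renderTestFrame(int width, int height) {
    // allocate() returns an I420 buffer backed by its own native memory
    JavaI420Buffer buffer = JavaI420Buffer.allocate(width, height);
    fillPlane(buffer.getDataY(), (byte) 128); // Y = 128
    fillPlane(buffer.getDataU(), (byte) 128); // U = 128
    fillPlane(buffer.getDataV(), (byte) 128); // V = 128 -> uniform gray image

    // VideoFrame expects the timestamp in nanoseconds
    VideoFrame frame = new VideoFrame(buffer, 0 /* rotation */, System.nanoTime());
    mSurfaceViewRenderer.onFrame(frame); // the renderer retains the frame internally
    frame.release();                     // drop our own reference
}

private static void fillPlane(ByteBuffer plane, byte value) {
    while (plane.hasRemaining()) {
        plane.put(value);
    }
}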
BTW, there are more details and attempts in my SO question, currently with a bounty :)
Hey @snachmsm, I'm not sure if it's a problem with the SurfaceViewRenderer. How about using VideoTextureRenderer?
Almost exactly the same issue:
E EglRenderer: java.lang.IllegalStateException: buffer is inaccessible
at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:159)
at org.webrtc.JavaI420Buffer.getDataY(JavaI420Buffer.java:118)
at org.webrtc.VideoFrameDrawer$YuvUploader.uploadFromBuffer(VideoFrameDrawer.java:114)
at org.webrtc.VideoFrameDrawer.drawFrame(VideoFrameDrawer.java:221)
at org.webrtc.EglRenderer.renderFrameOnRenderThread(EglRenderer.java:473)
at org.webrtc.EglRenderer.lambda$im8Sa54i366ODPy-soB9Bg4O-w4(Unknown Source:0)
at org.webrtc.-$$Lambda$EglRenderer$im8Sa54i366ODPy-soB9Bg4O-w4.run(Unknown Source:2)
at android.os.Handler.handleCallback(Handler.java:883)
at android.os.Handler.dispatchMessage(Handler.java:100)
at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:595)
at android.os.Looper.loop(Looper.java:214)
at android.os.HandlerThread.run(HandlerThread.java:67)
E FATAL EXCEPTION: VideoTextureViewRenderer: EglRenderer
Process: thats.my.package, PID: 20632
java.lang.IllegalStateException: buffer is inaccessible
at java.nio.DirectByteBuffer.slice(DirectByteBuffer.java:159)
at org.webrtc.JavaI420Buffer.getDataY(JavaI420Buffer.java:118)
at org.webrtc.VideoFrameDrawer$YuvUploader.uploadFromBuffer(VideoFrameDrawer.java:114)
at org.webrtc.VideoFrameDrawer.drawFrame(VideoFrameDrawer.java:221)
at org.webrtc.EglRenderer.renderFrameOnRenderThread(EglRenderer.java:473)
at org.webrtc.EglRenderer.lambda$im8Sa54i366ODPy-soB9Bg4O-w4(Unknown Source:0)
at org.webrtc.-$$Lambda$EglRenderer$im8Sa54i366ODPy-soB9Bg4O-w4.run(Unknown Source:2)
at android.os.Handler.handleCallback(Handler.java:883)
at android.os.Handler.dispatchMessage(Handler.java:100)
at org.webrtc.EglRenderer$HandlerWithExceptionCallback.dispatchMessage(EglRenderer.java:595)
at android.os.Looper.loop(Looper.java:214)
at android.os.HandlerThread.run(HandlerThread.java:67)
I'm "feeling" like this problem is related to ByteBuffer, every time I'm getting new one in my callback, maybe these surface/texture Views (and further EGLRenderer) need one buffer feeded live? just a guess as I don't have literally any knowlednge about java.nio...
Note that I'm not using VideoSink etc. related classes, I'm strictly calling onFrame (both tried Views got this method)
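If the ownership of that ByteBuffer really is the problem, I guess the fix would be to copy the planes into a buffer the renderer owns before handing the frame off, roughly like this (just a sketch, untested; assumes even width/height and tightly packed I420 planes, and uses org.webrtc.JavaI420Buffer.allocate so the renderer gets its own memory):
@Override
public void onYuvDataByteBuffer(MediaFormat mediaFormat, ByteBuffer data, int dataSize, int width, int height) {
    int ySize = width * height;
    int uvSize = ySize / 4;

    // copy into memory the renderer can keep using after this callback returns
    JavaI420Buffer i420 = JavaI420Buffer.allocate(width, height);
    copyPlane(data, 0, ySize, i420.getDataY());
    copyPlane(data, ySize, uvSize, i420.getDataU());
    copyPlane(data, ySize + uvSize, uvSize, i420.getDataV());

    // VideoFrame expects the timestamp in nanoseconds
    VideoFrame frame = new VideoFrame(i420, 0, System.nanoTime());
    mSurfaceViewRenderer.onFrame(frame); // the renderer retains the frame internally
    frame.release();                     // drop our reference; 'data' may now be reused/freed
}

private static void copyPlane(ByteBuffer src, int offset, int length, ByteBuffer dst) {
    ByteBuffer slice = src.duplicate();
    slice.position(offset);
    slice.limit(offset + length);
    dst.put(slice);
}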