Agora-Flutter-SDK
Using setExternalVideoSource method to send a DeepAR source to Agora
**Is your feature request related to a problem?**
Thanks for the Flutter support of Agora.
I need to implement AR filters in video calls. From my research, this can be done by accessing the native layer on Android/iOS. I'm new to Flutter plugin development; can you give me some pointers on how this can be achieved via the Flutter SDK?
FYI: The DeepAR SDK can be integrated with the Agora native SDK by pushing frames to Agora's pushExternalVideoFrame method.
You can refer to https://github.com/AgoraIO/Agora-Flutter-SDK/blob/master/example/android/app/src/main/kotlin/io/agora/agora_rtc_engine_example/custom_capture_audio/CustomCaptureAudioPlugin.kt
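To make the push-based integration above concrete, here is a rough Android sketch. This is untested: the `frameAvailable` callback and RGBA `Image` output are assumptions about DeepAR's offscreen-rendering mode, while `setExternalVideoSource`, `pushExternalVideoFrame`, and `AgoraVideoFrame` follow the Agora Android 3.x API. Buffer handling is simplified.

```kotlin
import android.media.Image
import io.agora.rtc.RtcEngine
import io.agora.rtc.video.AgoraVideoFrame

// Hypothetical bridge: receives DeepAR's processed frames and pushes them to Agora.
class DeepArToAgoraBridge(private val rtcEngine: RtcEngine) {

    init {
        // Tell Agora we will push frames ourselves instead of using its own camera capture.
        rtcEngine.setExternalVideoSource(true, /*useTexture=*/false, /*pushMode=*/true)
    }

    // Assumed to be wired to DeepAR's offscreen-rendering frame callback.
    fun frameAvailable(image: Image) {
        val plane = image.planes[0]
        val bytes = ByteArray(plane.buffer.remaining())
        plane.buffer.get(bytes)

        val frame = AgoraVideoFrame().apply {
            format = AgoraVideoFrame.FORMAT_RGBA
            timeStamp = System.currentTimeMillis()
            stride = plane.rowStride / plane.pixelStride  // stride in pixels (assumption: RGBA, pixelStride = 4)
            height = image.height
            buf = bytes
        }
        rtcEngine.pushExternalVideoFrame(frame)
    }
}
```

The key point is that once the external source is enabled, Agora stops capturing its own camera, so DeepAR can be the only camera owner.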
Thanks for the reply @LichKing-2234
After analyzing DeepAR, it does not provide a processed video frame that can be pushed to Agora; instead, it renders the processed frame to a SurfaceView. Is there any way I can push the surface texture to Agora?
So instead of Agora's RtcLocalView, I will use a SurfaceView to show camera frames and send the frames to DeepAR for processing. Once processed, DeepAR updates the SurfaceView.
Is there a way to pull this off?
@Anilkumar18 I think you can refer to our APIExample of ARCore; it seems they do something similar. https://github.com/AgoraIO/API-Examples/blob/master/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ARCore.java
@littleGnAl I looked at the example. Since it's a native implementation, they create the SurfaceView and attach the renderer from DeepAR directly to it, so it is available to be viewed on screen. But in Flutter's case, I don't have access to set or get the SurfaceView from/to RtcLocalView.
Is there anything I can do to get access to the RtcLocalView frames?
I think you should implement a Flutter PlatformView or Texture widget yourself to render it.
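For reference, a minimal PlatformView skeleton on Android (v2 embedding) could look like the following sketch. The class names are illustrative and the DeepAR wiring is omitted.

```kotlin
import android.content.Context
import android.view.SurfaceView
import android.view.View
import io.flutter.plugin.common.StandardMessageCodec
import io.flutter.plugin.platform.PlatformView
import io.flutter.plugin.platform.PlatformViewFactory

// The view DeepAR would render into; DeepAR initialization is omitted here.
class DeepArView(context: Context) : PlatformView {
    private val surfaceView = SurfaceView(context)

    override fun getView(): View = surfaceView

    override fun dispose() {
        // Release DeepAR and the camera here, tied to the view's lifecycle.
    }
}

// Creates a DeepArView each time Flutter instantiates the platform view.
class DeepArViewFactory : PlatformViewFactory(StandardMessageCodec.INSTANCE) {
    override fun create(context: Context?, viewId: Int, args: Any?): PlatformView =
        DeepArView(requireNotNull(context))
}
```

The factory would then be registered under a view type string via a plugin's platformViewRegistry, and shown from Dart with an AndroidView using the same view type.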
So should I create a platform view plugin in the Agora Flutter plugin's Android code? Or can you give me pointers on where I should create this platform view: inside the Agora Flutter SDK's Android side, or on the example app side?
@littleGnAl I went on to create a PlatformView implementation on the native side, and I implemented RtcEnginePlugin on my platform view so that I can get the RtcEngine instance. But onRtcEngineCreated(@Nullable RtcEngine rtcEngine) does not fire, even when I create the RtcEngine on the Flutter side before calling the code. Is there any particular case in which onRtcEngineCreated() fails to fire?
Here are some steps you can reference (I haven't tried them in practice):
- Make sure your implementation of DeepAR works fine on the native side (Android/iOS).
- Follow the official doc to implement a PlatformView that renders the GLSurfaceView: https://docs.flutter.dev/development/platform-integration/platform-views
- Implement the RtcEnginePlugin to handle the setExternalVideoSource logic.
- APIExample of ARCore: https://github.com/AgoraIO/API-Examples/blob/master/Android/APIExample/app/src/main/java/io/agora/api/example/examples/advanced/ARCore.java
- RtcEnginePlugin example: https://github.com/AgoraIO/Agora-Flutter-SDK/blob/master/example/lib/examples/advanced/custom_capture_audio/custom_capture_audio.dart

I think all the work can be decoupled rather than modifying the agora_rtc_engine code.
But the RtcEngine does not fire onRtcEngineCreated(@Nullable RtcEngine rtcEngine), even when I create the RtcEngine on the Flutter side before calling the code. Is there any particular case in which onRtcEngineCreated() fails to fire?
Make sure the RtcEnginePlugin has been registered:
https://github.com/AgoraIO/Agora-Flutter-SDK/blob/2f3cf423f2439a3923bf308b6a155b77b2b033c9/example/android/app/src/main/kotlin/io/agora/agora_rtc_engine_example/MainActivity.kt#L19
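Following the pattern of the linked MainActivity.kt, registration might look roughly like this. `DeepArPlugin` is a hypothetical Flutter plugin that implements RtcEnginePlugin; the exact registration mechanism is in the linked example.

```kotlin
import io.flutter.embedding.android.FlutterActivity
import io.flutter.embedding.engine.FlutterEngine

class MainActivity : FlutterActivity() {
    override fun configureFlutterEngine(flutterEngine: FlutterEngine) {
        super.configureFlutterEngine(flutterEngine)
        // Add the custom plugin that implements RtcEnginePlugin so it receives
        // onRtcEngineCreated when the Dart side creates the RtcEngine.
        flutterEngine.plugins.add(DeepArPlugin())
    }
}
```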
Thanks @littleGnAl
I was able to create the native implementation for DeepAR on Android. Is it possible to create the platform view plugin using the pigeon library? Or should I use a method channel and register it in the activity?
I tried implementing the platform view using method channels, but the RtcEnginePlugin was not registered properly and I wasn't able to get the RtcEngine instance inside the platform view code.
Is it possible to create the platformview plugin using the pigeon library.
Yes, but you should make sure you understand how it works; I think a MethodChannel is simpler for verifying your implementation at this stage. You can put the MethodChannel anywhere; it just depends on what the MethodChannel does.
I tried implementing the platform view using method channels, but the RtcEnginePlugin was not registered properly and I wasn't able to get the RtcEngine instance inside the platform view code.
You should have a way to store the RtcEngine so you can use it in other places, but you should make sure it will not be leaked. Or you can directly implement the RtcEnginePlugin interface in your PlatformView, and register/unregister it according to the PlatformView lifecycle.
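The register/unregister-with-lifecycle idea can be sketched in plain Kotlin with stand-in types. The real RtcEnginePlugin interface and its registration mechanism live in the agora_rtc_engine Android code; everything below is illustrative.

```kotlin
// Stand-in mirroring the shape of the agora_rtc_engine RtcEnginePlugin interface.
interface RtcEnginePlugin {
    fun onRtcEngineCreated(engine: Any?)
    fun onRtcEngineDestroyed()
}

// Stand-in for the SDK's plugin registration mechanism.
object PluginRegistry {
    private val plugins = mutableListOf<RtcEnginePlugin>()
    fun register(p: RtcEnginePlugin) { plugins += p }
    fun unregister(p: RtcEnginePlugin) { plugins -= p }
    fun notifyCreated(engine: Any?) = plugins.forEach { it.onRtcEngineCreated(engine) }
}

// A PlatformView-like class that owns its registration for exactly its lifetime.
class DeepArPlatformView : RtcEnginePlugin {
    var engine: Any? = null
        private set

    init {
        PluginRegistry.register(this)  // register when the view is created
    }

    override fun onRtcEngineCreated(engine: Any?) { this.engine = engine }
    override fun onRtcEngineDestroyed() { engine = null }

    fun dispose() {
        PluginRegistry.unregister(this)  // unregister in PlatformView.dispose()
    }
}
```

The point is that the view's registration lasts exactly as long as the view, so a freshly created PlatformView always receives the current RtcEngine and stops receiving callbacks after it is disposed.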
Thanks @littleGnAl
I am able to create the platform view with a method channel, and I verified that DeepAR is working as expected on the native Android side.
But I am facing an issue with pushing the frames to Agora. I'm creating an RtcEngine on the Flutter side whenever I open the DeepAR platform view. If I call enableVideo() on the RtcEngine from the Flutter side, the camera is opened and the frames are pushed to Agora. But in my case I'm also initializing the camera provider on the native platform side, inside the DeepAR code. This results in accessing the camera twice, which stops the native side from capturing the camera stream; hence only unprocessed video is pushed to Agora from the Flutter side.
So instead of calling enableVideo() from the Flutter side, I called enableVideo() on the RtcEngine reference on the native side. This fixed the camera capture provider issue, and the frames are pushed to Agora. This works perfectly the first time and DeepAR-processed frames are pushed to Agora, but if I close and reopen the platform view screen, DeepAR works fine yet there is no video on the receiving end. Is this related to RtcEngine's enableVideo()? How can I call enableVideo() from the native side?
My code if needed for reference https://github.com/Anilkumar18/Agora-Flutter-SDK.git
But in my case, I'm also initializing the camera provider in the native platform side inside DeepAR code.
Did you call the setExternalVideoSource function?
But if I close and reopen the platform view screen, DeepAR works fine yet there is no video on the receiving end.
It's most likely a lifecycle issue; make sure you can get the RtcEngine correctly.
@littleGnAl
Yes, I've called the setExternalVideoSource function. And I'm not getting the RtcEngine-created callback inside my platform view; I can only get the RtcEngine inside the plugin's register-view-factory code, and I'm sending it as a parameter while creating the platform view. I think this may be the issue. Is there a way to get the RtcEngine-created event directly in the platform view, or how can it be done using the PlatformViewFactory?
Maybe you should check the result of the pushExternalVideoFrame function, to see if it succeeds or not.
Maybe you should check the result of the pushExternalVideoFrame function, to see if it succeeds or not.
You are right, it's false.
There is an RtcEngine instance, but the pushExternalVideoFrame function failed. Is this related to using an old instance of RtcEngine the second time? Is there any way to check that the RtcEngine instance is the newest one?
@littleGnAl I tried to check whether the RtcEngine I get is the same instance or a different one. When logging the RtcEngine reference the first and second time, the references were completely different, so I assume the RtcEngine is created correctly. If so, what other reasons could there be for frames not being pushed the second time? I already checked that setExternalVideoSource is set to true. Is there anything different in play here?
Per the API docs, you should make sure setExternalVideoSource is called before pushExternalVideoFrame:
https://docs.agora.io/en/Video/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_rtc_engine.html#a6e7327f4449800a2c2ddc200eb2c0386
https://docs.agora.io/en/Video/API%20Reference/java/classio_1_1agora_1_1rtc_1_1_rtc_engine.html#a2d9966c52798ab62ed941fa865e926cd
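Per those docs, the required ordering could be sketched like this (Agora Android 3.x API names; the helper functions themselves are illustrative):

```kotlin
import io.agora.rtc.RtcEngine
import io.agora.rtc.video.AgoraVideoFrame

fun joinWithExternalSource(engine: RtcEngine, token: String?, channel: String) {
    // 1. Enable the external source first (buffer mode, push mode)...
    engine.setExternalVideoSource(true, /*useTexture=*/false, /*pushMode=*/true)
    // 2. ...then join the channel.
    engine.joinChannel(token, channel, /*optionalInfo=*/null, /*optionalUid=*/0)
}

fun pushFrame(engine: RtcEngine, frame: AgoraVideoFrame): Boolean {
    // 3. Push frames; a false return usually means the external source was not
    //    enabled on this engine instance, or the instance has been destroyed.
    return engine.pushExternalVideoFrame(frame)
}
```

If a second RtcEngine instance is created, step 1 has to be repeated on that new instance before pushing.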
Thanks a lot @littleGnAl, I was able to fix the issue. The issue was that platform views are registered only once in Flutter, but I was relying on registration happening every time the RtcEngine is created, since I was sending the RtcEngine reference each time it registered. I was able to set the RtcEngine directly when the platform view does not re-register, so I have access to the latest RtcEngine reference and am able to push the frames. I know this seems like a workaround, but it works for now, until I get to use pigeon instead of a method channel.
And thanks @LichKing-2234 for pointing me in the right direction at the start.
@Anilkumar18 You're welcome. BTW, I still suggest you decouple your implementation from the agora_rtc_engine plugin and create it as a Flutter plugin; this will help you maintain your code more easily, and you can easily share your plugin with others if you want.
I'm not sure how to do that yet @littleGnAl. Right now I rely on the RtcEnginePlugin to do the job, and I don't have any idea how to do a video call without implementing it.
Hi @Anilkumar18,
Thank you for your work! I successfully implemented this thanks to you.
Now I am considering improving my app's performance, and I came across this Agora raw data plugin: https://pub.dev/packages/agora_rtc_rawdata
Hi @littleGnAl,
Would you be able to guide us regarding this?
If you can program in C++, you should process the raw data on the C++ layer to improve performance and remove the code that calls into Android and iOS.
Thank you in advance.
Hi @KalanaPerera,
Glad you were able to implement this in your app. I have completed the Android implementation as mentioned above, but I'm not an iOS dev, so I'm struggling with the iOS implementation of the same. Can you guide me on how you implemented this in your app?
Hello @Anilkumar18, could you please share your Android implementation? It would be helpful for me to achieve this on my side in Flutter Android.
Share your email. I'll mail the link
Hello @Anilkumar18, could you please share your link for the Android implementation?
According to the documentation, version 4 of the plugin allows the use of setExternalVideoSource: https://docs.agora.io/en/video-calling/develop/custom-video-and-audio?platform=flutter
A platform view has been registered and is working when I use
AndroidView(viewType: 'OpenGLDisplayView')
But it doesn't work when I initiate an observer like this:
var handle = await engine.getNativeHandle();
await AgoraRtcRawdata.registerVideoFrameObserver(handle);
Here is the configuration of the PlatformView in the .kt file:
mGLDisplayViewFactory = GLDisplayViewFactory(messenger)
binding.platformViewRegistry.registerViewFactory("OpenGLDisplayView", mGLDisplayViewFactory)
NOTE: I am performing this configuration in a local package which I am accessing via path. Do I need to change my MainActivity.java?
How do I actively use 'OpenGLDisplayView' as a custom video stream?
agora_rtc_rawdata: ^0.1.0
agora_rtc_engine: ^6.1.1
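On the MainActivity question: a view factory is usually registered inside a plugin's onAttachedToEngine, in which case MainActivity does not need changes. A rough sketch, reusing the GLDisplayViewFactory name from the snippet above (that class is assumed to exist in your package):

```kotlin
import io.flutter.embedding.engine.plugins.FlutterPlugin

class GLDisplayPlugin : FlutterPlugin {
    override fun onAttachedToEngine(binding: FlutterPlugin.FlutterPluginBinding) {
        // Registering here ties the factory to the Flutter engine, so no changes
        // to MainActivity are needed even when the package is pulled in via path.
        binding.platformViewRegistry.registerViewFactory(
            "OpenGLDisplayView",
            GLDisplayViewFactory(binding.binaryMessenger)  // your existing factory class
        )
    }

    override fun onDetachedFromEngine(binding: FlutterPlugin.FlutterPluginBinding) {}
}
```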
Hey @Anilkumar18, it would be helpful for me to take a look at this too :). I'm particularly having trouble with the camera portion, as I'm not super familiar with the Android APIs and I keep finding wildly different implementations for accessing the camera. Would appreciate you reaching out at [email protected] :)