api.video-reactnative-live-stream
feat(*): add support for new architecture
Update to Fabric architecture and clean the project
Status: Waiting for test
What has been done?
- [x] iOS: both old and new arch
- [x] Android: both old and new arch
- [x] Android: manage permissions at low level (bugfix in progress)
- [x] Android preview: https://github.com/apivideo/api.video-reactnative-live-stream/issues/64
Not tested, but it should fix:
- https://github.com/apivideo/api.video-reactnative-live-stream/issues/66
- https://github.com/apivideo/api.video-reactnative-live-stream/issues/65
What is missing?
- [ ] tests Android legacy arch
- [ ] tests Android new arch
- [ ] tests iOS legacy arch
- [ ] tests iOS new arch
TODO
- [ ] Issue with orientation when rotating the device from portrait to landscape
How can I help?
- By testing the example with one or several of the following cases: Android old arch, Android new arch, iOS old arch, iOS new arch.
How to test?
- Directly by using the example: See https://github.com/apivideo/api.video-reactnative-live-stream?tab=readme-ov-file#example-app
- In your project: install this branch with `yarn add https://github.com/apivideo/api.video-reactnative-live-stream.git#feature/new_arch_support`
Please write down what you have been testing 🙏
I am trying to fix this: https://github.com/apivideo/api.video-reactnative-live-stream/issues/64 for the release
Waiting for feedback/issues (bugs only) from the community now. For issues, please answer here.
Tested iOS and Android builds with the new arch branch: iOS works fine but Android fails to compile.
Environment:
"react-native": "0.72.10",
"@api.video/react-native-livestream": "apivideo/api.video-reactnative-live-stream#feature/new_arch_support",
Android: newArchEnabled=false
Error:
After running `yarn android`:

```
> Configure project :api.video_react-native-livestream
Project Directory: /Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native-livestream/android
5 actionable tasks: 5 up-to-date

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
* Where: Build file '/Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native-livestream/android/build.gradle' line: 33
* What went wrong: A problem occurred evaluating project ':api.video_react-native-livestream'.
  /Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native/package.json (No such file or directory)

2: Task failed with an exception.
* What went wrong: A problem occurred configuring project ':api.video_react-native-livestream'.
  compileSdkVersion is not specified. Please add it to build.gradle
```

It seems like the path to the package.json file needs to be amended for the first issue.
> 1: Task failed with an exception.
> * Where: Build file '/Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native-livestream/android/build.gradle' line: 33
> * What went wrong: A problem occurred evaluating project ':api.video_react-native-livestream'.
> /Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native/package.json (No such file or directory)

`/Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native/package.json` is a file from the `react-native` package and not related to the RN live stream library. Are you sure your package has been properly installed?
> 2: Task failed with an exception.
> * What went wrong: A problem occurred configuring project ':api.video_react-native-livestream'.
> compileSdkVersion is not specified. Please add it to build.gradle

Do you have `compileSdkVersion` in your build.gradle?
Thanks for your feedback! These two errors got fixed.
> 1: Task failed with an exception.
> * Where: Build file '/Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native-livestream/android/build.gradle' line: 33
> * What went wrong: A problem occurred evaluating project ':api.video_react-native-livestream'.
> /Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native/package.json (No such file or directory)
>
> `/Users/XXXX/Desktop/project-name-XXX/node_modules/@api.video/react-native/package.json` is a file from the `react-native` package and not related to the RN live stream library. Are you sure your package has been properly installed?

Yes, `yarn install` works fine, and I have tried to nuke node_modules and reinstall all packages; this file does not exist in my project after reinstalling everything. On lines 28-30 of build.gradle, it tries to find the react-native package.json file.
The projectDir in my project is /Users/username/XXXX/project-name/node_modules/@api.video/react-native-livestream/android. I got this error fixed by adding one more `/..` after `$projectDir` on line 33, changing it to `file("$projectDir/../../../react-native/package.json")`.
> 2: Task failed with an exception.
> * What went wrong: A problem occurred configuring project ':api.video_react-native-livestream'.
> compileSdkVersion is not specified. Please add it to build.gradle
>
> Do you have `compileSdkVersion` in your build.gradle?

Yes, the previous version pointed to 33, but this got fixed by upgrading RN to 0.73.4 and compileSdkVersion to 34.
Nice, thanks for the help 👍
Not sure where I got this `build.gradle` from, but I got a new one from the RN community that will hopefully resolve the issue you face.
Could you test https://github.com/apivideo/api.video-reactnative-live-stream/pull/67/commits/eb17ff95f011b8189d8d28e1f80ff73ecae7b5a8?
Thanks for your time and patience 🙏
> Could you test eb17ff9?
Hi, tested the new commit, things are working well 👍
Did you test new arch (Fabric) or the old architecture?
Expo demo project: https://github.com/BlueBazze/api.video-reactnative-live-stream-expo-demo The src folder is simply the PR branch example, with the only change being that the icons use Expo icons instead.
Expo build ran without errors: https://expo.dev/accounts/tbj/projects/api-video-reactnative-live-stream-expo-demo/builds/8057d3ab-295a-44bf-b600-019760447dc0
Haven't gotten to actually testing livestreaming yet, so these are only startup errors.
Android
When opening the element inspector, the preview turned black. The inspector itself worked. Don't think this is a real problem.
01 - Fixed
When I started the app, it prompted for permissions, which were granted. This may have been the old arch. Then it threw the following error after a few seconds.
```
io.github.thibaultbee.streampack.error.StreamPackError: java.lang.IllegalArgumentException: Surface was abandoned
io.github.thibaultbee.streampack.streamers.bases.BaseCameraStreamer$startPreview$2.invokeSuspend(BaseCameraStreamer.kt:108)
kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106)
kotlinx.coroutines.EventLoopImplBase.processNextEvent(EventLoop.common.kt:280)
kotlinx.coroutines.BlockingCoroutine.joinBlocking(Builders.kt:85)
kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking(Builders.kt:59)
kotlinx.coroutines.BuildersKt.runBlocking(Unknown Source:1)
kotlinx.coroutines.BuildersKt__BuildersKt.runBlocking$default(Builders.kt:38)
kotlinx.coroutines.BuildersKt.runBlocking$default(Unknown Source:1)
io.github.thibaultbee.streampack.streamers.bases.BaseCameraStreamer.startPreview(BaseCameraStreamer.kt:101)
video.api.livestream.ApiVideoLiveStream$startPreview$1$1$1.invoke(ApiVideoLiveStream.kt:394)
video.api.livestream.ApiVideoLiveStream$startPreview$1$1$1.invoke(ApiVideoLiveStream.kt:389)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:71)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:68)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:38)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.PermissionsManager$requestPermissions$1.onAllGranted(PermissionsManager.kt:51)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:76)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:49)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.requestPermissions$lambda$1(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.$r8$lambda$dzig6asy22NzT_1QrKBawhgieas(Unknown Source:0)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$$ExternalSyntheticLambda1.run(Unknown Source:10)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
java.lang.Thread.run(Thread.java:1012)
Caused by java.lang.IllegalArgumentException: Surface was abandoned
android.hardware.camera2.utils.SurfaceUtils.getSurfaceSize(SurfaceUtils.java:135)
android.hardware.camera2.params.OutputConfiguration.<init>(OutputConfiguration.java:615)
android.hardware.camera2.params.OutputConfiguration.<init>(OutputConfiguration.java:325)
io.github.thibaultbee.streampack.internal.sources.camera.CameraController.createCaptureSession(CameraController.kt:137)
io.github.thibaultbee.streampack.internal.sources.camera.CameraController.access$createCaptureSession(CameraController.kt:36)
io.github.thibaultbee.streampack.internal.sources.camera.CameraController$startCamera$3.invokeSuspend(CameraController.kt:182)
kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:108)
kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:584)
kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:793)
kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:697)
kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:684)
```
Tried restarting the app a few times; the error kept occurring. So I gave it permission for notifications, and then it worked without the error. I removed that permission again and it continued not to throw this error.
02 - Fixed
I haven't found a way to reproduce this consistently. Sometimes when opening the dev app, this error is thrown on startup, causing the app to crash. Restarting the app makes the error go away.
```
android.view.ViewRootImpl$CalledFromWrongThreadException: Only the original thread that created a view hierarchy can touch its views.
android.view.ViewRootImpl.checkThread(ViewRootImpl.java:11586)
android.view.ViewRootImpl.requestLayout(ViewRootImpl.java:2648)
android.view.View.requestLayout(View.java:27623)
android.view.View.requestLayout(View.java:27623)
android.view.View.requestLayout(View.java:27623)
android.view.View.requestLayout(View.java:27623)
android.view.View.requestLayout(View.java:27623)
android.view.View.requestLayout(View.java:27623)
android.view.View.requestLayout(View.java:27623)
android.view.View.requestLayout(View.java:27623)
androidx.constraintlayout.widget.ConstraintLayout.requestLayout(ConstraintLayout.java:3605)
android.view.View.requestLayout(View.java:27623)
androidx.constraintlayout.widget.ConstraintLayout.requestLayout(ConstraintLayout.java:3605)
android.view.View.requestLayout(View.java:27623)
android.view.SurfaceView$1.setFixedSize(SurfaceView.java:1673)
io.github.thibaultbee.streampack.views.AutoFitSurfaceView.setAspectRatio(AutoFitSurfaceView.kt:47)
video.api.livestream.ApiVideoLiveStream.startPreview(ApiVideoLiveStream.kt:385)
video.api.livestream.ApiVideoLiveStream.setVideoConfig(ApiVideoLiveStream.kt:97)
video.api.reactnative.livestream.LiveStreamView$videoConfig$1.invoke(LiveStreamView.kt:129)
video.api.reactnative.livestream.LiveStreamView$videoConfig$1.invoke(LiveStreamView.kt:126)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermission$request$1$1.invoke(SerialPermissionsManager.kt:66)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermission$request$1$1.invoke(SerialPermissionsManager.kt:63)
video.api.reactnative.livestream.utils.permissions.PermissionsManager$requestPermission$1.onAllGranted(PermissionsManager.kt:104)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:76)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermission(PermissionsManager.kt:102)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.requestPermission$lambda$3(SerialPermissionsManager.kt:63)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.$r8$lambda$Z-0SVJvo1Foa5ZS9Ynby6y7UM5k(Unknown Source:0)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$$ExternalSyntheticLambda0.run(Unknown Source:10)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
java.lang.Thread.run(Thread.java:1012)
```
03
The specific case is not reproducible in a production build, but writing it down anyway. To reproduce: open the dev menu and toggle the inspector, enable the touchables, open the dev menu again, and reload the app.
```
io.github.thibaultbee.streampack.error.StreamPackError: java.lang.NullPointerException: Attempt to invoke virtual method 'int android.media.audiofx.AcousticEchoCanceler.setEnabled(boolean)' on a null object reference
io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:213)
video.api.livestream.ApiVideoLiveStream$audioConfig$2.invoke(ApiVideoLiveStream.kt:89)
video.api.livestream.ApiVideoLiveStream$audioConfig$2.invoke(ApiVideoLiveStream.kt:84)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:72)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:69)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:38)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.PermissionsManager$requestPermissions$1.onAllGranted(PermissionsManager.kt:51)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:76)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:49)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.requestPermissions$lambda$1(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.$r8$lambda$dzig6asy22NzT_1QrKBawhgieas(Unknown Source:0)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$$ExternalSyntheticLambda1.run(Unknown Source:10)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
java.lang.Thread.run(Thread.java:1012)
Caused by java.lang.NullPointerException: Attempt to invoke virtual method 'int android.media.audiofx.AcousticEchoCanceler.setEnabled(boolean)' on a null object reference
io.github.thibaultbee.streampack.internal.sources.AudioSource.configure(AudioSource.kt:63)
io.github.thibaultbee.streampack.internal.sources.AudioSource.configure(AudioSource.kt:33)
io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:206)
video.api.livestream.ApiVideoLiveStream$audioConfig$2.invoke(ApiVideoLiveStream.kt:89)
video.api.livestream.ApiVideoLiveStream$audioConfig$2.invoke(ApiVideoLiveStream.kt:84)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:72)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:69)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:38)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.PermissionsManager$requestPermissions$1.onAllGranted(PermissionsManager.kt:51)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:76)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:49)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.requestPermissions$lambda$1(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.$r8$lambda$dzig6asy22NzT_1QrKBawhgieas(Unknown Source:0)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$$ExternalSyntheticLambda1.run(Unknown Source:10)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
java.lang.Thread.run(Thread.java:1012)
```
iOS
Seems to work without problems; I did make sure that the new arch is enabled.
> Did you test new arch (Fabric) or the old architecture?

Old architecture.
> Expo demo project: https://github.com/BlueBazze/api.video-reactnative-live-stream-expo-demo The src folder is simply the PR branch example, with the only change being that the icons use Expo icons instead. ...
> Seems to work without problems; I did make sure that the new arch is enabled.

I am working on this in the native dependency: https://github.com/apivideo/api.video-android-live-stream Could you test the latest dev on this branch: https://github.com/apivideo/api.video-reactnative-live-stream/tree/bugfix/android_preview?
> Could you test the latest dev on this branch: https://github.com/apivideo/api.video-reactnative-live-stream/tree/bugfix/android_preview?
I've started a build with that branch. It's going to take about 2-3 hours for the build to finish. I'll test it tonight and update this comment with the results.
https://expo.dev/accounts/tbj/projects/api-video-reactnative-live-stream-expo-demo/builds/2572c89d-0289-4394-b1c7-99b95565d974
No errors were thrown while booting the demo app, so I toyed around trying to find any error.
When I opened the element inspector, the preview turned black. The inspector itself worked. Don't think this is a problem.
I did manage to somehow get this error. The specific case is not reproducible in a production build, but writing it down anyway. To reproduce: open the dev menu and toggle the inspector, enable the touchables, open the dev menu again, and reload the app.
```
io.github.thibaultbee.streampack.error.StreamPackError: java.lang.NullPointerException: Attempt to invoke virtual method 'int android.media.audiofx.AcousticEchoCanceler.setEnabled(boolean)' on a null object reference
io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:213)
video.api.livestream.ApiVideoLiveStream$audioConfig$2.invoke(ApiVideoLiveStream.kt:89)
video.api.livestream.ApiVideoLiveStream$audioConfig$2.invoke(ApiVideoLiveStream.kt:84)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:72)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:69)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:38)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.PermissionsManager$requestPermissions$1.onAllGranted(PermissionsManager.kt:51)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:76)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:49)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.requestPermissions$lambda$1(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.$r8$lambda$dzig6asy22NzT_1QrKBawhgieas(Unknown Source:0)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$$ExternalSyntheticLambda1.run(Unknown Source:10)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
java.lang.Thread.run(Thread.java:1012)
Caused by java.lang.NullPointerException: Attempt to invoke virtual method 'int android.media.audiofx.AcousticEchoCanceler.setEnabled(boolean)' on a null object reference
io.github.thibaultbee.streampack.internal.sources.AudioSource.configure(AudioSource.kt:63)
io.github.thibaultbee.streampack.internal.sources.AudioSource.configure(AudioSource.kt:33)
io.github.thibaultbee.streampack.streamers.bases.BaseStreamer.configure(BaseStreamer.kt:206)
video.api.livestream.ApiVideoLiveStream$audioConfig$2.invoke(ApiVideoLiveStream.kt:89)
video.api.livestream.ApiVideoLiveStream$audioConfig$2.invoke(ApiVideoLiveStream.kt:84)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:72)
video.api.reactnative.livestream.LiveStreamView$1$1.invoke(LiveStreamView.kt:69)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:38)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$requestPermissions$request$1$1.invoke(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.PermissionsManager$requestPermissions$1.onAllGranted(PermissionsManager.kt:51)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:76)
video.api.reactnative.livestream.utils.permissions.PermissionsManager.requestPermissions(PermissionsManager.kt:49)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.requestPermissions$lambda$1(SerialPermissionsManager.kt:35)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager.$r8$lambda$dzig6asy22NzT_1QrKBawhgieas(Unknown Source:0)
video.api.reactnative.livestream.utils.permissions.SerialPermissionsManager$$ExternalSyntheticLambda1.run(Unknown Source:10)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:644)
java.lang.Thread.run(Thread.java:1012)
```
Did you test new arch (Fabric) or the old architecture?
Hey there, I'm a colleague of @Veryinheart. Firstly, thank you very much for your work on this project! We were having a hard time getting other libs to work on both Android and iOS, plus dealing with incompatible react-native versions. You've solved all of that for us, so we're very grateful.
We are using the old architecture. We're currently doing some device testing. We'll let you know how it goes.
@ThibaultBee Sorry that I am so unreliable...
I have tested Android with the bugfix branch you made and only found one problem. I've added the new error to my original comment and marked the first two as fixed.
Nice. Thanks.
So we are getting closer to a release :)
Are you running on an emulator?
> Are you running on an emulator?

No, I don't use emulators for any kind of testing. Tested on a Samsung Galaxy A22 5G.
I'll begin more testing on iPhone, using an iPhone 8.
Then afterwards I'll start actually livestreaming.
Btw, regarding iOS support: I'm not sure what HaishinKit currently supports as a minimum, but it could be worth having a docs file describing why the lowest supported version of iOS is X, and also what is handled differently on each version, whether that comes from HaishinKit, React Native, or something else. I know that when I was maintaining my own HaishinKit implementation I had to handle differences between iOS 10 and 11. The most recent chart I know of is https://www.statista.com/chart/5824/ios-iphone-compatibility/ I know a lot of our customers use iPhone 8 and X.
Don't know if this is even worth writing down, but I think it could be useful for new developers.
Nice, you found one issue. We shall support iOS from 12.0.
@BlueBazze
Your issue happens here: https://github.com/ThibaultBee/StreamPack/blob/994b995705e1566ccb2a18d848ad907e4afaebdb/core/src/main/java/io/github/thibaultbee/streampack/internal/sources/AudioSource.kt#L63
I guess we should check that `create` returns a `null` object. But it is odd that `isAvailable` is `true` in this case. I will make the change but this will be released later.
I am currently releasing the Android dependency to integrate the fixes on the preview in this branch.
Are you going to perform other tests? Not sure if you said it but have you tested on new arch or on old arch?
@Veryinheart @matthewfleming How long do you need to test this PR?
Are you going to perform other tests?
Yes, i havent started actually streaming yet.
Not sure if you said it but have you tested on new arch or on old arch?
The very first one was with the old arch i believe. But it should be the new one. I specified expo to use the new arch: https://github.com/BlueBazze/api.video-reactnative-live-stream-expo-demo/blob/master/app.json#L45-L50 But i dont know how it works under the hood and how it knows which arch to use from the package.
Is there a way to check which arch it is running on from the running code?
> But I don't know how it works under the hood and how it knows which arch to use from the package.

This package is supposed to support both archs on iOS and Android.

> Is there a way to check which arch it is running on from the running code?

Not sure... There might be a log somewhere.
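One heuristic that several RN libraries use (treat it as an assumption, not an official React Native API) is to check from JS whether Fabric's native UIManager binding is installed on the global object:

```ts
// Heuristic, not an official API: when the new architecture (Fabric) is
// enabled, React Native installs a native UIManager binding on the JS global.
const isFabric: boolean = (globalThis as any)?.nativeFabricUIManager != null;

console.log(`Running on the ${isFabric ? 'new (Fabric)' : 'old (Paper)'} architecture`);
```

If it logs "new (Fabric)", the JS runtime is being driven by the new renderer; otherwise the old bridge is in use.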
> @Veryinheart @matthewfleming How long do you need to test this PR?
Hi @ThibaultBee, we have tested on some devices and below are the results:
iOS:
- iPhone 14 Pro, iOS 16.2: works well
- iPhone X, iOS 15.5: works well
- iPhone 7, iOS 14.7: works well
Android:
- Samsung Galaxy S9, Android 10: works well. Live streaming dropped once; after restarting it, everything was OK, so it might have been a network issue.
- Samsung Galaxy S7: works well
- Pixel 6a, Android 14: works well
Question: Do we support live streaming in background mode? The live stream is stopped if we move the app to the background while it is live streaming.
> Question: Do we support live streaming in background mode? The live stream is stopped if we move the app to the background while it is live streaming.

Yes, this is on purpose. I thought using cameras in the background was not legit on Android, but I am not sure anymore. Either way, we would have to use a background service and it is not something we want to do. Moreover, it is the same behavior as camera applications.
I've tested streaming on an iPhone 8 and a Samsung A22 and couldn't find any problems. Haven't tested orientation changes; I'll get to it as soon as possible.
> Yes, this is on purpose. ... Moreover, it is the same behavior as camera applications.
When I looked into this, the easiest way was to use PiP. Back then I found https://github.com/adkaushik/react-native-pip-android but I haven't seen any other packages allowing this functionality, and I've thought of it as too big a hassle. If it creates another component specifically for the PiP view, then React would render that as a separate native component, and it would not contain the same data and state as the original stream view, causing the PiP view not to be live. Otherwise the entire stream functionality would have to be packed into a singleton on the native side, with each stream view just using the singleton's information, meaning only one stream could be active at any one point.
But yes, older people (and even some younger ones) will not know why their stream stopped just because they tapped away from the application. What I have done is display an in-app notification telling the user their stream has stopped, even if they stopped streaming themselves.
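For reference, a minimal sketch of that kind of in-app notice, assuming React Native's AppState API; `isLive` and `setIsLive` are hypothetical pieces of the app's own state, not part of this library:

```ts
import { useEffect, useRef } from 'react';
import { Alert, AppState } from 'react-native';

// Sketch only: tell the user why their stream ended after the app was backgrounded.
export function useBackgroundStreamNotice(isLive: boolean, setIsLive: (live: boolean) => void) {
  const stoppedInBackground = useRef(false);

  useEffect(() => {
    const sub = AppState.addEventListener('change', (state) => {
      if (state !== 'active' && isLive) {
        // The native side stops the stream when the app leaves the foreground;
        // remember that so we can explain it once the user comes back.
        stoppedInBackground.current = true;
        setIsLive(false);
      } else if (state === 'active' && stoppedInBackground.current) {
        stoppedInBackground.current = false;
        Alert.alert('Stream stopped', 'Live streaming stops when the app goes to the background.');
      }
    });
    return () => sub.remove();
  }, [isLive, setIsLive]);
}
```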
Orientation
Only tested on Android.
The phone was rotated while the app was running before going live.
I restarted the app in landscape.
That's going to be a tricky one.
If you remove `orientation` from `android:configChanges`, that might fix the issue.
In a native Android application, the activity is recreated when the orientation of the device changes. This is not the case in an RN application (not sure why).
But this is definitely something that should be handled by the library.
> If you remove `orientation` from `android:configChanges`, that might fix the issue.

I don't believe it is present. With default Expo, the only app config available is https://docs.expo.dev/versions/latest/config/app/#orientation
Just to add on the orientation: it works fine with the current master version as far as I know.
I listen to orientation changes with an event listener and keep a state with the current orientation. When pressing "go live", I lock the app orientation to the current one; when going offline, I unlock the orientation.
This has worked great so far. But if the outgoing stream is not oriented according to the preview, the user would not know until looking at it from the viewer's perspective.
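A minimal sketch of that pattern, assuming `expo-screen-orientation` (the hook name and the mapping are mine, not part of this library):

```ts
import { useEffect, useRef } from 'react';
import * as ScreenOrientation from 'expo-screen-orientation';

// Map a reported Orientation to the OrientationLock that freezes it.
const LOCK_FOR: Partial<Record<ScreenOrientation.Orientation, ScreenOrientation.OrientationLock>> = {
  [ScreenOrientation.Orientation.PORTRAIT_UP]: ScreenOrientation.OrientationLock.PORTRAIT_UP,
  [ScreenOrientation.Orientation.LANDSCAPE_LEFT]: ScreenOrientation.OrientationLock.LANDSCAPE_LEFT,
  [ScreenOrientation.Orientation.LANDSCAPE_RIGHT]: ScreenOrientation.OrientationLock.LANDSCAPE_RIGHT,
};

// Lock the app to its current orientation while live, unlock when the stream stops.
export function useOrientationLockWhileLive(isLive: boolean) {
  const current = useRef(ScreenOrientation.Orientation.PORTRAIT_UP);

  useEffect(() => {
    // Track the device orientation while not live.
    ScreenOrientation.getOrientationAsync().then((o) => (current.current = o));
    const sub = ScreenOrientation.addOrientationChangeListener((e) => {
      current.current = e.orientationInfo.orientation;
    });
    return () => ScreenOrientation.removeOrientationChangeListener(sub);
  }, []);

  useEffect(() => {
    if (isLive) {
      ScreenOrientation.lockAsync(
        LOCK_FOR[current.current] ?? ScreenOrientation.OrientationLock.PORTRAIT_UP
      );
    } else {
      ScreenOrientation.unlockAsync();
    }
  }, [isLive]);
}
```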
Your orientation setting is not the same as `orientation` in `configChanges`.
See https://github.com/apivideo/api.video-reactnative-live-stream/blob/aa6138bb8f30abed14fa70b8d66516b267a98398/example/android/app/src/main/AndroidManifest.xml#L17
I need to dig a bit deeper into this issue.