
[Feat]: Stream signal via type C/OTG cable

Open cryinrain69 opened this issue 10 months ago • 33 comments

Version

2.6.1

Environment that reproduces the issue

Samsung Galaxy S22 Ultra - Android 14

Use case description

Currently I know the library supports streaming through the camera and recording the device's screen. In case I want to get the input signal from an external camera connected via a USB Type C cable, is that possible? More broadly, the signal could be from another source such as a digital TV receiver, or another device that uses an HDMI to type C converter cable to connect to the phone.

Proposed solution

No response

Alternative solutions

No response

cryinrain69 avatar Feb 18 '25 15:02 cryinrain69

UVC camera support is something I want to work on, but I am really busy with other subjects. You are asking for a lot of features and asking a lot of questions; it might be time to start considering sponsorship and/or contributions to help the development of this library.

ThibaultBee avatar Feb 18 '25 17:02 ThibaultBee

I agree.

I am currently using this library for experimental development; it is not yet used in any released project. During this testing, I found that it will need many more advanced features to meet the requirements of a complete project (e.g. using external camera sources, adding effects, filters, icons, and text, streaming from files,... like some other more mature libraries).

With the basic features as they are now, I find the quality of the library quite good, with a lot of potential.

If, in addition to the current basic features, your library gains more advanced ones, it could well be commercialized. I think everyone would welcome that.

As for sponsorship and donations, they are completely reasonable at any stage of the library's development 👍

cryinrain69 avatar Feb 19 '25 02:02 cryinrain69

This library is quite young compared to others. I already have the features you mention in mind. Most of them are quite easy to add from 3.X onward. It is just a matter of time.

ThibaultBee avatar Feb 23 '25 18:02 ThibaultBee

@ThibaultBee UVC camera support is also a feature I'm looking to integrate. Can you integrate it into v3? Or can you give some suggestions on how to do it? I found some UVC camera libraries that seem suitable:

  • https://github.com/jiangdongguo/AndroidUSBCamera
  • https://github.com/saki4510t/UVCCamera
  • https://github.com/saki4510t/OpenCVwithUVC

Rider02vn avatar Mar 04 '25 16:03 Rider02vn

Can you integrate it into v3?

It is not what I have planned right now.

Or can you give some suggestions on how to do it?

If you want to be involved in the development, you are welcome. You could also sponsor the project.

  • Find a proper UVC library (or do we even need a library?):
    • it must be maintained
    • it must be possible to pass a Surface
    • it must pass my validation 👍

For any dev, use branch dev_v3.

In Extensions, add a uvccamera module. It must depend on the StreamPack core. The UVCCameraSource implements the interfaces ICameraSourceInternal and ISurfaceSource.

As a matter of fact, you can already create your own video sources in your application. In 2.6.X, you have to implement IVideoSource.

ThibaultBee avatar Mar 04 '25 20:03 ThibaultBee
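A rough sketch of what such a custom video source could look like, based only on the names mentioned in this thread (IVideoSource, encoderSurface, startStream). The interface members and the UvcHelper wrapper below are illustrative assumptions, not the actual StreamPack 2.6.X or UVC-library APIs:

```kotlin
import android.view.Surface

// Hypothetical wrapper around a UVC library (e.g. shiyinghan/UVCAndroid);
// the method names here are placeholders, not a real API.
interface UvcHelper {
    fun addSurface(surface: Surface)
    fun removeSurface(surface: Surface)
    fun startCapture()
    fun stopCapture()
}

// Sketch of a custom video source. The member names follow what is mentioned
// in this thread; the real IVideoSource interface may differ.
class UVCCameraSource(private val uvc: UvcHelper) /* : IVideoSource */ {
    // Set by the streamer: the MediaCodec encoder's input Surface.
    var encoderSurface: Surface? = null

    fun startStream() {
        val surface = checkNotNull(encoderSurface) {
            "encoderSurface must be set by the streamer before startStream()"
        }
        // Route UVC frames into the encoder's input Surface.
        uvc.addSurface(surface)
        uvc.startCapture()
    }

    fun stopStream() {
        encoderSurface?.let { uvc.removeSurface(it) }
        uvc.stopCapture()
    }
}
```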

I will look into this feature and see how it goes. Thank you.

Rider02vn avatar Mar 06 '25 03:03 Rider02vn

Also, I would like to ask: when do you plan to release v3, and what notable features will it have?

Rider02vn avatar Mar 06 '25 03:03 Rider02vn

I will look into this feature and see how it goes. Thank you.

Hi, I have now got a pretty good quality signal from UVC, but I don't know how to use it in StreamPack. Can I send you the demo source code so you can integrate it and help me? (You can keep it on a test dev branch.)

Rider02vn avatar Mar 06 '25 07:03 Rider02vn

Also, I would like to ask: when do you plan to release v3, and what notable features will it have?

See bulletpoints in: https://github.com/ThibaultBee/StreamPack/discussions/87

Can I send you the demo source code so you can integrate it help me?

You can but I have other priorities right now.

ThibaultBee avatar Mar 06 '25 08:03 ThibaultBee

Hi. I'm currently using this UVC library: shiyinghan/UVCAndroid. I've created a UVCCameraSource class in core/internal/sources as shown in the image below (using v2.6.1), but I don't know how to use it: switching the preview between the camera (SRT, RTMP streamer) and UVC, pushing it to the stream, etc. Can you guide me through the next steps to implement it? I would be very grateful.

Class UVCCameraSource:

UVCCameraSource.kt.txt

(screenshots attached)

Rider02vn avatar Mar 07 '25 10:03 Rider02vn

@ThibaultBee Can you help me? :-(

Rider02vn avatar Mar 10 '25 02:03 Rider02vn

Hi,

Why have you chosen this library over the previous ones?

But I don't know how to use it: switching the preview between the camera (SRT, RTMP streamer) and UVC, pushing it to the stream, etc.

What are you trying to achieve exactly?

@ThibaultBee Can you help me? :-(

I work on this library in my spare time, so I can't be fast. Also, I have other priorities right now. My focus is to finish 3.0 and finally release it.

ThibaultBee avatar Mar 10 '25 09:03 ThibaultBee

Hi

Why have you chosen this library over the previous ones?

The reasons I chose that library:

  1. It has been updated recently.
  2. When I tested it, I found it very easy to use, and it gave quite good quality.
  3. It is also used by another livestream library (RootEncoder) to integrate a UVC camera. Demo here

What are you trying to achieve exactly?

I am following your instructions, i.e. creating a UVCCameraSource class (like the image/file I sent above). I also referred to how the RootEncoder library integrates that UVC library to apply the same approach to yours. I have now got the signal from the UVC camera and displayed its preview (based on AspectRatioSurfaceView), but I don't know how to push it to the livestream server. That's what I want to do.

Rider02vn avatar Mar 10 '25 09:03 Rider02vn

Then have a look at a streamer like CameraRtmpLiveStreamer.

For the live stream, you should set the encoderSurface (v2.X) or outputSurface (v3.X) instead of the previewSurface.

Once again, you should work on dev_v3.

ThibaultBee avatar Mar 10 '25 09:03 ThibaultBee
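To make the distinction above concrete: the preview and the live stream are fed by two different Surfaces. A minimal sketch, assuming a UvcHelper-style wrapper whose addSurface method is a placeholder name, not a real API:

```kotlin
import android.view.Surface
import android.view.SurfaceView

// Placeholder for a UVC library wrapper that can render to multiple Surfaces.
interface UvcHelper {
    fun addSurface(surface: Surface)
}

// The preview and the live stream are two different sinks:
fun wireSurfaces(uvc: UvcHelper, previewView: SurfaceView, encoderSurface: Surface) {
    uvc.addSurface(previewView.holder.surface) // preview: what the user sees on screen
    uvc.addSurface(encoderSurface)             // stream: the encoder's input Surface
}
```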

Once again, you should work on dev_v3.

I haven't had much time to get familiar with the dev_v3 branch, so it will be easier for me to start from v2. Once it is implemented, I will gradually move it to dev_v3.

I am still using CameraSrtLiveStreamer for now. With the phone's camera, it is working fine. But how should I use the UVCCameraSource class above? Can you please explain the details?

Rider02vn avatar Mar 10 '25 10:03 Rider02vn

Have a look at the CameraSrtLiveStreamer hierarchy, e.g. BaseCameraLiveStreamer and BaseCameraStreamer.

ThibaultBee avatar Mar 10 '25 11:03 ThibaultBee

Have a look at the CameraSrtLiveStreamer hierarchy, e.g. BaseCameraLiveStreamer and BaseCameraStreamer.

I have looked and created a UVCCameraSrtLiveStreamer class and subclasses like CameraSrtLiveStreamer. In the **UVCCameraSource** class, I still don't know what to do in fun startStream: how do I use the **encoderSurface**? I have seen that in CameraSource and ScreenSource, fun startStream uses the encoderSurface.

(screenshots attached)

Rider02vn avatar Mar 10 '25 14:03 Rider02vn

@ThibaultBee Hi, can you guide me through the next steps? I'm really stuck :-(

Rider02vn avatar Mar 11 '25 08:03 Rider02vn

@ThibaultBee Hi, can you guide me through the next steps? I'm really stuck :-(

@Rider02vn Have you solved this problem? I see that the RootEncoder library also implements a class called CameraUvcSource, which seems similar to what StreamPack is implementing, but theirs is currently working. One problem is that RootEncoder streams have no sound over the SRT protocol (RTMP works fine), while StreamPack works fine with both SRT and RTMP. That's why I'm choosing StreamPack for SRT.

I also need this feature. @ThibaultBee Can you help us? Please.

cryinrain69 avatar Mar 11 '25 15:03 cryinrain69

Hi. I still can't do it.

I have now opened the USB camera and got the signal to display on the previewSurface. The problem now is to push that signal to the livestream.

The problem is probably in the startStream() function of the UVCCameraSource class; we will need to do something with the encoderSurface in that function. Unfortunately, I don't have much experience handling this problem. I feel like it's just a little bit away from completion, but I still can't do it. Maybe only @ThibaultBee can help us at this point. Please.

(screenshot attached)

Rider02vn avatar Mar 11 '25 16:03 Rider02vn

The encoderSurface is supposed to be set by the streamer. Have you looked for encoderSurface in the project? I haven't worked on v2.6 in a long time, so I don't remember some parts.

ThibaultBee avatar Mar 11 '25 20:03 ThibaultBee

The encoderSurface is supposed to be set by the streamer. Have you looked for encoderSurface in the project? I haven't worked on v2.6 in a long time, so I don't remember some parts.

Hi. If I switch to the dev_v3_1 branch now, will it be easier to integrate UVC? And will it take a long time to convert code using the library from v2.6.1 to dev_v3_1? Please.

Rider02vn avatar Mar 12 '25 02:03 Rider02vn

The encoderSurface is supposed to be set by the streamer. Have you looked for encoderSurface in the project? I haven't worked on v2.6 in a long time, so I don't remember some parts.

I have searched for encoderSurface but I don't know how to handle it with UVC Camera.

  • With CameraSource, encoderSurface is handled with CameraController (using the device's camera).
  • With ScreenSource, it is handled with MediaProjection, like in the screenshots from my comment above (#2710814212).

I am quite confused and don't know how to handle the encoderSurface with the UVC camera. Can you take a moment to look at the UVCCameraSource class attached above and tell me the flow that should handle the encoderSurface?

Rider02vn avatar Mar 12 '25 03:03 Rider02vn

If I switch to the dev_v3_1 branch now, will it be easier to integrate UVC?

Prefer dev_v3, as dev_v3_1 is still unstable.

And will it take a long time to convert code using the library from v2.6.1 to dev_v3_1?

Hmm, that is difficult to say. The source implementation is almost the same as in v2.6.1. At some point you will have to move to v3, so I think it is pointless to keep working on v2.6.X.

I am quite confused and don't know how to handle encoderSurface with UVC Camera.

Pass the encoderSurface to the UVC Camera helper.

ThibaultBee avatar Mar 12 '25 08:03 ThibaultBee

Hi. Awesome, I got it working. Thank you very much. I will be moving to dev_v3 soon.

Rider02vn avatar Mar 12 '25 09:03 Rider02vn

FYI, dev_v3_1 will be rebased into dev_v3 soon.

Features of dev_v3_1 are:

  • independent outputs (for example, live stream and record at the same time)
  • dynamic input (for switching to another video or audio source)

ThibaultBee avatar Mar 12 '25 14:03 ThibaultBee

Hi, these features are very useful, especially the ability to select video and audio sources. Besides the phone's built-in audio, streaming is often done with a headset microphone, a dedicated microphone, or audio from an external camera source (UVC). Being able to switch between audio sources is really great.

Rider02vn avatar Mar 13 '25 04:03 Rider02vn

@ThibaultBee Hi. I am having a problem. On the UVC preview surface, the video is displayed in the correct size (1920x1080) and orientation (landscape) as expected, but when I livestream (encoder), the server receives the video with the orientation reversed even though the size is still 1920x1080. How can I fix it?

Rider02vn avatar Mar 13 '25 08:03 Rider02vn

I have fixed that issue. The fix is in VideoMediaCodecEncoder => CodecSurface => onFrameAvailable: rotate the stMatrix.

Rider02vn avatar Mar 13 '25 09:03 Rider02vn
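The fix described above can be sketched as follows. SurfaceTexture.getTransformMatrix and android.opengl.Matrix are real Android APIs, but where exactly StreamPack's CodecSurface applies stMatrix is not shown here, and the 90° angle is an assumption that may vary per device and source:

```kotlin
import android.graphics.SurfaceTexture
import android.opengl.Matrix

// Run on the GL thread that owns the texture, once per UVC frame
// (typically triggered by SurfaceTexture.OnFrameAvailableListener).
fun onFrameAvailable(surfaceTexture: SurfaceTexture, stMatrix: FloatArray) {
    surfaceTexture.updateTexImage()
    surfaceTexture.getTransformMatrix(stMatrix) // fills a 4x4 column-major matrix

    // Rotate about the texture center (0.5, 0.5): without the surrounding
    // translations, the rotation would pivot around a corner and push the
    // texture coordinates out of the [0, 1] range.
    Matrix.translateM(stMatrix, 0, 0.5f, 0.5f, 0f)
    Matrix.rotateM(stMatrix, 0, 90f, 0f, 0f, 1f) // angle is an assumption
    Matrix.translateM(stMatrix, 0, -0.5f, -0.5f, 0f)

    // ... the frame is then drawn to the encoder Surface using stMatrix.
}
```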

I have fixed that issue. The fix is in VideoMediaCodecEncoder => CodecSurface => onFrameAvailable: rotate the stMatrix.

It is probably already fixed in v3.

ThibaultBee avatar Mar 14 '25 13:03 ThibaultBee