ALVR
[Feature Request] Disable encoding (raw video transfer)
I'm wondering if adding a feature to disable video encoding entirely would be feasible?
This might sound silly at first, but there is a good reason I'd like to do this. Currently I am using ALVR to connect to my Quest 3 via port forwarding over ADB. I have determined that this is a USB 3.2 connection, and in experimental testing with iperf I was able to achieve data transfer rates of up to 20.8 Gbit/s before the ADB connection broke down due to network congestion.
Unfortunately, tuning the compression settings to target a bitrate that high doesn't seem to work; at least NVENC fails at this task. High bandwidth targets also come with a significant latency hit: anything above 300 Mbit/s adds at minimum 20 ms of latency.
I propose adding an option that disables encoding entirely. This could eliminate all encoding and decoding latency in the system while offering higher visual fidelity.
A quick calculation confirms that such data rates are feasible:
2 (number of screens) × 2064 × 2208 (per-eye resolution) × 120 Hz (refresh rate) × 12 bpp (bits per pixel at 8-bit 4:2:0 color) ≈ 13.13 Gbit/s
(This could be reduced further by lowering the refresh rate; the 12 bpp figure already assumes 4:2:0 chroma subsampling.)
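The arithmetic above can be sanity-checked with a few lines of Rust (the function name and parameters are just for illustration; all figures come from the post):

```rust
// Sanity check of the raw-stream bandwidth estimate above.
// Assumes 8-bit YUV 4:2:0, i.e. 12 bits per pixel.
fn raw_stream_bits_per_second(
    screens: u64,
    width: u64,
    height: u64,
    refresh_hz: u64,
    bits_per_pixel: u64,
) -> u64 {
    screens * width * height * refresh_hz * bits_per_pixel
}

fn main() {
    // Quest 3: 2 screens × 2064×2208 × 120 Hz × 12 bpp
    let bps = raw_stream_bits_per_second(2, 2064, 2208, 120, 12);
    println!("{:.2} Gbit/s", bps as f64 / 1e9); // ≈ 13.13 Gbit/s
}
```

This lands comfortably under the 20.8 Gbit/s measured with iperf, but leaves no headroom for 8-bit RGB (24 bpp), which would need about 26.25 Gbit/s.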
I believe that adding this feature would not be very complex, and that the following changes would be needed:
- Adding a new `EncodePipelineDummy` (or other name) in `alvr/server/cpp/platform/linux`, which copies the Vulkan image on the GPU into a CPU buffer
- Adding a new option to disable encoding in `Settings` (in `alvr/server/cpp/alvr_server/Settings.h`)
- Adding a new UI element to allow the user to toggle the disable-encoding option
- Adding a new client decoder for raw video data (possibly in `alvr/client_core/src/platform/android/decoder.rs`)
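On the client side, the raw-frame "decoder" could amount to little more than a passthrough that validates the buffer size and hands it to the renderer. A rough sketch, where every type and name is hypothetical rather than the actual `alvr/client_core` API:

```rust
// Hypothetical passthrough "decoder" for raw frames: instead of feeding the
// packet to MediaCodec, it hands the buffer straight on after a size check.
// All types and names here are illustrative, not the real client_core API.

/// A received video packet: timestamp plus raw 8-bit 4:2:0 pixel data.
struct RawFramePacket {
    timestamp_ns: u64,
    nv12_data: Vec<u8>,
}

struct PassthroughDecoder {
    width: usize,
    height: usize,
}

impl PassthroughDecoder {
    /// "Decoding" a raw frame is just validation; no codec work is done.
    fn decode(&self, packet: RawFramePacket) -> Result<RawFramePacket, String> {
        // NV12/4:2:0: full-res luma plane + half-res interleaved chroma plane.
        let expected = self.width * self.height * 3 / 2;
        if packet.nv12_data.len() != expected {
            return Err(format!(
                "bad frame size: got {}, expected {}",
                packet.nv12_data.len(),
                expected
            ));
        }
        Ok(packet)
    }
}
```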
I'd love to contribute to making this a feature. I'm not confident writing Rust code, but I would like to contribute the new encoding pipeline if possible.
I'm currently rewriting the Linux rendering pipeline to get SteamVR direct mode working, so I'd a) advise you not to add features to the Linux code unless you want to redo your work in a couple of weeks (assuming Valve actually fixes their stuff that quickly, which you really can't be sure of), or to work off my branch (I'd need to clean it up a bit first), and b) offer to help with the changes to the encoder, since I'm pretty familiar with that part of the codebase at the moment.
One would need to change quite a bit of code on the client side, although you might just be able to bypass a lot of it. Signalling that the video is raw would also require changes to the protocol, which means this change would need a new ALVR release.
I'd also recommend joining the ALVR Discord server, since that makes development quite a bit easier, especially as this will require quite a bit of coordination.
I believe such high bitrates might cause OOM. And as you know, there is no way to enable USB streaming without ADB, so we need to go through the network stack, which might make this less feasible.
Also, before jumping to uncompressed, I'd look into MJPEG encoding, which is equivalent to H.264/HEVC with only IDR frames.
> I'm currently rewriting the Linux rendering pipeline to get SteamVR direct mode working, so I'd a) advise you not to add features to the Linux code unless you want to redo your work in a couple of weeks (assuming Valve actually fixes their stuff that quickly, which you really can't be sure of), or to work off my branch (I'd need to clean it up a bit first), and b) offer to help with the changes to the encoder, since I'm pretty familiar with that part of the codebase at the moment.
I'm not too familiar with VR or with what SteamVR's direct mode is, but I'd be happy to work on your branch until that gets merged.
> One would need to change quite a bit of code on the client side, although you might just be able to bypass a lot of it. Signalling that the video is raw would also require changes to the protocol, which means this change would need a new ALVR release.
I agree this would require some amount of client-side changes, as I pointed out in my original issue. Looking at the protocol, I believe the client-side changes can be made without any breaking changes, but the server changes would cause strange behaviour on the client if a new codec were added. This is because the function that handles setting the configuration data does not properly validate the incoming data.
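The validation gap described above could be closed by rejecting unknown codec identifiers explicitly instead of letting them fall through. A minimal sketch, where the enum variants and wire values are assumptions rather than the real protocol:

```rust
// Hypothetical codec negotiation: parse the codec id from the stream config
// and fail loudly on values this client does not understand, instead of
// silently misbehaving. The wire values are made up for illustration.
#[derive(Debug, PartialEq)]
enum Codec {
    H264,
    Hevc,
    Raw, // the proposed uncompressed mode
}

fn parse_codec(wire_id: u8) -> Result<Codec, String> {
    match wire_id {
        0 => Ok(Codec::H264),
        1 => Ok(Codec::Hevc),
        2 => Ok(Codec::Raw),
        other => Err(format!("unsupported codec id {other}: refusing to stream")),
    }
}
```

With an explicit error here, an old client paired with a new server would fail the handshake cleanly rather than attempt to decode raw frames with a hardware codec.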
> I believe such high bitrates might cause OOM. And as you know, there is no way to enable USB streaming without ADB, so we need to go through the network stack, which might make this less feasible.
The test I ran on network transmission speed went through the whole stack, so these rates are indeed possible. More specifically, every packet traveled like so: iperf3 client (on host) → host network stack → ADB server → USB 3.2 connection → Android internals → device network stack → Termux (terminal emulator) → iperf3 server (on device).
> Also, before jumping to uncompressed, I'd look into MJPEG encoding, which is equivalent to H.264/HEVC with only IDR frames.
I believe that an uncompressed protocol is significantly easier to implement, which is partly why I am suggesting it. Another reason is that any form of compression inherently adds latency and quality loss and increases the load on the GPU. If the bandwidth allows for it, I don't see any reason to use compression.
I'll join the Discord server now. My name there is the same as here.
> Another reason is that any form of compression inherently adds latency and quality loss and increases the load on the GPU. If the bandwidth allows for it, I don't see any reason to use compression.
Yes, but the cost of simple compression and decompression might be a lot lower than the cost of the entire network stack, especially on the client, which depending on the actual hardware might be able to decode the image in hardware; the encode cost isn't that high either. So it may actually be cheaper than pushing raw images through the network stack, and might even be better for latency, because the network stack is the main thing that's prone to stalling.
There was some work a while ago to reduce the number of data copies and allocations in the network code. By removing one allocation per video packet and two data copies, it was possible to shave up to 5 ms of latency. That's why I'm skeptical of increasing the amount of data the CPU needs to process, or simply move around.
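The kind of saving described above can be illustrated by reusing one preallocated buffer for every incoming packet instead of allocating a fresh `Vec` per packet. This is a generic sketch, not the actual ALVR networking code:

```rust
// Illustration of the allocation-avoidance idea: one reusable receive buffer
// shared across all packets, instead of one heap allocation per video packet.
// Generic sketch; names and structure are not taken from ALVR.
use std::io::Read;

struct PacketReceiver {
    buf: Vec<u8>, // allocated once, reused for every packet
}

impl PacketReceiver {
    fn new(max_packet: usize) -> Self {
        Self { buf: vec![0u8; max_packet] }
    }

    /// Reads one packet into the reusable buffer and returns a borrowed slice,
    /// so the caller can process the data without an extra copy.
    fn recv<'a, R: Read>(&'a mut self, src: &mut R) -> std::io::Result<&'a [u8]> {
        let n = src.read(&mut self.buf)?;
        Ok(&self.buf[..n])
    }
}
```

At uncompressed frame sizes (about 13.7 MB per stereo frame at the resolution above), each avoided copy matters far more than it does at compressed bitrates.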
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
Would be very interested in this too; I was actually going to open an issue for it myself if there wasn't one already.
@The-personified-devil
> I'm currently rewriting the Linux rendering pipeline to get SteamVR direct mode working, so I'd a) advise you not to add features to the Linux code unless you want to redo your work in a couple of weeks (assuming Valve actually fixes their stuff that quickly, which you really can't be sure of), or to work off my branch (I'd need to clean it up a bit first), and b) offer to help with the changes to the encoder, since I'm pretty familiar with that part of the codebase at the moment.
What is the state of this rewrite? Is it already present upstream, or in a position where code could be contributed?
Also, is there anyone familiar with the client code who might be able to help implement this feature? I'm not very well versed in Rust and could not contribute any client-side code.
> What is the state of this rewrite? Is it already present upstream, or in a position where code could be contributed?
Valve still hasn't fixed their stuff, so it's still just a not-quite-finished PR.
I may be able to help with some of the code if someone can put together a starting point to point me in the right direction. I'm experienced in Rust, but not so much with hardware access and compression.
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.