moonlight-android
Is it possible to lower the decoder latency to 5ms or less?
I saw a guy using Moonlight to stream a 1080p game to his Android phone with a decoder latency of about 3 ms, and he didn't mention his setup.
Here's the video: https://www.bilibili.com/video/BV1AS4y1f7LR?share_source=copy_web&vd_source=c382a012d6ea71f111a2c8fd4f6e5ad3
I noticed that OMX.hisi.video.decoder.hevc was being used in his case. Is this decoder the reason he can achieve such low latency?
In my case, I use a Pixel 5a as the client and an RTX 3060 as the Moonlight server, and I always get about 10 ms decoder latency, whether I use H.264 or H.265.
It depends on the hardware decoder on your SoC and the media drivers. Not all hardware decoders can achieve the same latency.
Moonlight requests low latency decoding using the standard KEY_LOW_LATENCY option introduced in Android 11, so that should basically give you the best latency that the hardware and drivers are capable of.
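For reference, this is roughly what that looks like at the MediaCodec level. The following is a simplified sketch, not Moonlight's actual code; the decoder name, resolution, and surface are placeholders.

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.os.Build;
import android.view.Surface;

import java.io.IOException;

// Simplified sketch (not Moonlight's actual code): configure a video decoder and
// request the standard low-latency mode when the platform supports it.
public final class LowLatencyDecoderSketch {
    public static MediaCodec configureDecoder(String decoderName, Surface surface,
                                              int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(
                MediaFormat.MIMETYPE_VIDEO_HEVC, width, height);

        // KEY_LOW_LATENCY only exists on Android 11 (API 30) and later; on older
        // releases there is no standard way for an app to request low-latency decode.
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
            format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);
        }

        MediaCodec decoder = MediaCodec.createByCodecName(decoderName);
        decoder.configure(format, surface, null, 0);
        return decoder;
    }
}
```

Even with low latency requested, the resulting number still depends on how the vendor's decoder and media driver implement it, which is why different SoCs report very different decode times.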
Does this mean that on a device with Android 10 (like the Ayn Odin Pro, which uses an SD845-based SoC), that would explain why users are reporting latency of ~13 ms to ~17 ms? Or can low latency be enabled on it with code changes to the client?
It is possible that there are vendor-specific options to enable low latency on devices that don't support the standard KEY_LOW_LATENCY option; however, these options are not widely available. Even on devices with drivers that support low latency mode, many do not expose the option to apps outside of the core system, so Moonlight can't use it.
Moonlight does enable the known vendor-specific low latency option for Qualcomm SoCs, so we're probably getting the best latency that we can on that device.
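For anyone curious, a vendor extension is just an extra string key set on the MediaFormat before configure(). Here is a hedged sketch of the Qualcomm case, assuming the publicly documented vendor.qti-ext-dec-low-latency.enable key and typical Qualcomm decoder-name prefixes; Exynos and other vendors would need their own keys, if they expose any to regular apps at all.

```java
import android.media.MediaFormat;

// Sketch: opt into Qualcomm's vendor-specific low-latency extension on decoders
// that predate (or don't honor) the standard KEY_LOW_LATENCY option.
// The key name is taken from Qualcomm's codec extension documentation; whether
// the driver actually honors it is device- and firmware-specific.
public final class VendorLowLatencySketch {
    public static void applyVendorLowLatency(MediaFormat format, String decoderName) {
        if (decoderName.startsWith("OMX.qcom.") || decoderName.startsWith("c2.qti.")) {
            format.setInteger("vendor.qti-ext-dec-low-latency.enable", 1);
        }
    }
}
```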
@cgutman Thanks a lot. I've upgraded to a Qualcomm 870 device and now get around 3 ms.
Hi, I'm on an SD888 and I get 12-18 ms decoding time. How can I enable the low latency key?
Could this be implemented for Samsung Exynos phones, or is it only possible on Qualcomm?
I found that on my Windows 11 client with an Intel UHD 630 iGPU, I get much higher decoding latency, around 12 ms, if I use the default fullscreen mode. If I use borderless windowed mode, I get a decode + vsync latency of just 0.6 ms. Not sure if that is a measurement error? I'm referring to the Ctrl+Alt+Shift+S stats.
My test result: 2K 60 fps, H.264. Host: 3060 Ti. Client: GTX 650.
Decode latency is around 1 ms.
On a mobile phone client, it's 8 ms.
Bump again on the Exynos question. Can this be enabled for phones that have Exynos? On my S22 Ultra the decoding latency doesn't go below 15 ms; it mostly sits at 20+.
I own a Samsung S8+, which comes with an Exynos 8895, and I am getting 8 to 10 ms decoding latency. But I'm not sure why I get around 4 ms when streaming using Parsec.
Note: I mostly test it in Samsung DeX mode.
Can you tell me what decoder it's using on your S8+? It should be displayed in the overlay stats. Something is getting set seriously wrong on the S22 Ultra for it to get such high decoding latency.
I use it at 1080p; I'll try to see what the latency is at 4K just to compare. I'm using 45 Mbps as the bitrate for 1080p, while for 4K I used the default.
I'm not sure if you need this information, but my computer's GPU is an RX 590 and the CPU is an AMD Ryzen 5 2600X.
I think your GPU is better. I have an NVIDIA GeForce 1070, but I don't think that's the cause. Even on a static desktop stream my average decoding time is above 20 ms. The culprit seems to be the decoder used: c2.exynos.hevc.decoder
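One thing worth checking on an Exynos device is whether the decoder even advertises the standard low-latency feature; if it doesn't, the KEY_LOW_LATENCY request is simply ignored. Below is a small sketch; the decoder and MIME type in the usage example are just illustrations.

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.os.Build;

// Sketch: report whether a named decoder advertises FEATURE_LowLatency.
// If it doesn't, requesting MediaFormat.KEY_LOW_LATENCY has no effect.
public final class DecoderFeatureCheck {
    public static boolean supportsLowLatency(String decoderName, String mimeType) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.R) {
            return false; // FEATURE_LowLatency is only defined on Android 11+
        }
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder() || !info.getName().equalsIgnoreCase(decoderName)) {
                continue;
            }
            try {
                return info.getCapabilitiesForType(mimeType)
                        .isFeatureSupported(MediaCodecInfo.CodecCapabilities.FEATURE_LowLatency);
            } catch (IllegalArgumentException e) {
                return false; // this decoder doesn't handle the given MIME type
            }
        }
        return false;
    }
}
```

For example, supportsLowLatency("c2.exynos.hevc.decoder", "video/hevc") would tell you whether the standard option is even available on that codec.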