Is it possible to lower the decoder latency to 5ms or less?

Open tankxiaodi opened this issue 2 years ago • 3 comments

I saw someone using Moonlight to stream a 1080p game to his Android phone with a decoder latency of around 3 ms, but he didn't mention his setup.

Here's the video: https://www.bilibili.com/video/BV1AS4y1f7LR?share_source=copy_web&vd_source=c382a012d6ea71f111a2c8fd4f6e5ad3

I noticed that OMX.hisi.video.decoder.hevc was being used in his case. Is this decoder the reason he can achieve such low latency?

In my case, I use a Pixel 5a with an RTX 3060 as the Moonlight server, and I always get about 10 ms of decoder latency no matter whether it's H.264 or H.265.

tankxiaodi avatar Sep 03 '22 10:09 tankxiaodi

It depends on the hardware decoder on your SoC and the media drivers. Not all hardware decoders can achieve the same latency.

Moonlight requests low latency decoding using the standard KEY_LOW_LATENCY option introduced in Android 11, so that should basically give you the best latency that the hardware and drivers are capable of.

cgutman avatar Sep 03 '22 17:09 cgutman
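
For reference, the standard path looks roughly like this. A minimal sketch, not Moonlight's actual code; the class name and helper methods are illustrative:

```java
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.os.Build;

public final class LowLatencyHint {
    // Builds a decoder MediaFormat that requests low-latency mode on
    // Android 11 (API 30) and later; older releases skip the key entirely.
    public static MediaFormat buildFormat(String mimeType, int width, int height) {
        MediaFormat format = MediaFormat.createVideoFormat(mimeType, width, height);
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
            format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);
        }
        return format;
    }

    // Checks whether a decoder actually advertises the low-latency feature.
    // Drivers that don't will simply ignore KEY_LOW_LATENCY.
    public static boolean supportsLowLatency(MediaCodecInfo info, String mimeType) {
        return Build.VERSION.SDK_INT >= Build.VERSION_CODES.R
                && info.getCapabilitiesForType(mimeType).isFeatureSupported(
                        MediaCodecInfo.CodecCapabilities.FEATURE_LowLatency);
    }
}
```

Even with the key set, the latency actually achieved still depends on the media drivers, which is why identical Moonlight builds show very different numbers across SoCs.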

Does this mean that a device running Android 10 (like the Ayn Odin Pro, which uses an SD845-based SoC) explains why users are reporting latency of ~13 ms to ~17 ms? Or can low latency be enabled on it with code changes to the client?

scyto avatar Sep 17 '22 00:09 scyto

It is possible that there are vendor-specific options to enable low latency on devices that don't support the standard KEY_LOW_LATENCY option; however, these options are not widely available. Even on devices with drivers that support low-latency mode, many do not expose the option to apps outside of the core system, so Moonlight can't use it.

Moonlight does enable the known vendor-specific low latency option for Qualcomm SoCs, so we're probably getting the best latency that we can on that device.

cgutman avatar Sep 17 '22 19:09 cgutman
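
As a sketch of what that vendor-specific path can look like, assuming the commonly cited Qualcomm extension key vendor.qti-ext-dec-low-latency.enable and these decoder name prefixes (this is not a complete picture of Moonlight's decoder handling):

```java
import android.media.MediaFormat;
import android.os.Build;

public final class VendorLowLatency {
    // Applies the best available low-latency hint for a given decoder.
    public static void applyHint(MediaFormat format, String decoderName) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
            // Standard key: preferred wherever Android 11+ is available.
            format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);
        } else if (decoderName.startsWith("OMX.qcom.")
                || decoderName.startsWith("c2.qti.")) {
            // Qualcomm vendor extension for older Android versions.
            format.setInteger("vendor.qti-ext-dec-low-latency.enable", 1);
        }
        // No equivalent app-accessible key is known for e.g. Exynos decoders.
    }
}
```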

@cgutman Thanks a lot. I've upgraded to a Snapdragon 870 device and now get around 3 ms.

tankxiaodi avatar Nov 08 '22 08:11 tankxiaodi

Hi, I'm on an SD888 and I get 12-18 ms decoding time. How can I enable the low-latency key?

HamzaHKR avatar Nov 08 '22 11:11 HamzaHKR

Could this be implemented for Samsung Exynos phones, or is it only possible on Qualcomm?

apoklyps3 avatar Mar 01 '23 05:03 apoklyps3

I found that on my Windows 11 client with an Intel UHD 630 iGPU, I get much higher decoding latency, around 12 ms, if I use the default fullscreen mode. If I use borderless windowed mode, I get a decode + vsync latency of just 0.6 ms. Not sure if that is a measurement error? I'm referring to the Ctrl+Alt+Shift+S stats.

makedir avatar Aug 25 '23 22:08 makedir

My test results: 2K 60 fps H.264. Host: 3060 Ti. Client: GTX 650.

Decode latency is around 1 ms.

On a mobile phone client: 8 ms.

LeadroyaL avatar Sep 17 '23 12:09 LeadroyaL

Bump again on the Exynos question. Can this be enabled for phones that have Exynos chips? On my S22 Ultra the decoding latency doesn't go below 15 ms; it mostly sits at 20+.

apoklyps3 avatar Nov 30 '23 08:11 apoklyps3

> Bump again on the Exynos question. Can this be enabled for phones that have Exynos chips? On my S22 Ultra the decoding latency doesn't go below 15 ms; it mostly sits at 20+.

I own a Samsung S8+, which comes with an Exynos 8895, and I get 8 to 10 ms decoding latency. I'm not sure why I get around 4 ms when streaming with Parsec, though.

Note: I mostly test it in Samsung DeX mode.

Amoudi05 avatar Jan 21 '24 22:01 Amoudi05

Can you tell me what decoder your S8+ is using? It should be displayed in the overlay stats. Something is getting set seriously wrong on the S22 Ultra to produce such high decoding latency.

apoklyps3 avatar Jan 22 '24 05:01 apoklyps3

> Can you tell me what decoder your S8+ is using? It should be displayed in the overlay stats. Something is getting set seriously wrong on the S22 Ultra to produce such high decoding latency.

I use it at 1080p; I also tried 4K just to compare the latency. I'm using 45 Mbps as the bitrate for 1080p, while for 4K I used the default.

(Two screenshots of the Moonlight overlay stats, dated Jan 22 '24, were attached here.)

I'm not sure if you need this information: my computer's GPU is an RX 590, and the CPU is a Ryzen 5 2600X.

Amoudi05 avatar Jan 22 '24 09:01 Amoudi05

I think your GPU is better; I have an Nvidia GeForce 1070, but I don't think that's the cause. Even on a static desktop stream, my average decoding time is above 20 ms. The culprit seems to be the decoder in use: c2.exynos.hevc.decoder.

apoklyps3 avatar Jan 22 '24 15:01 apoklyps3
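
For anyone debugging this, one way to see which HEVC decoders a device actually exposes, and whether any of them advertise the standard low-latency feature, is to walk MediaCodecList. A diagnostic sketch under the same assumptions as the earlier snippets:

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.media.MediaFormat;
import android.os.Build;
import android.util.Log;

public final class DecoderDump {
    // Logs every HEVC decoder on the device (e.g. c2.exynos.hevc.decoder)
    // and whether it claims FEATURE_LowLatency (checkable on Android 11+ only).
    public static void dumpHevcDecoders() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) {
                continue;
            }
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_HEVC)) {
                    continue;
                }
                boolean lowLatency = Build.VERSION.SDK_INT >= Build.VERSION_CODES.R
                        && info.getCapabilitiesForType(type).isFeatureSupported(
                                MediaCodecInfo.CodecCapabilities.FEATURE_LowLatency);
                Log.i("DecoderDump", info.getName() + " lowLatency=" + lowLatency);
            }
        }
    }
}
```

If only the Exynos Codec2 decoder shows up and it doesn't report the feature, the ~15-20 ms floor is likely a driver limitation rather than something Moonlight can work around.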