moonlight-android
[Issue]: Fire Stick 4K Max (2023) high latency
Describe the bug
The Fire Stick 4K Max (2023) has higher decoding latency than expected given its hardware specs, especially compared to the last generation (2021).
I observed the following benchmarks:
- 4K60 50Mbps HEVC: 18ms average (similar with H.264)
- 1080p60 50Mbps HEVC: 15ms average (similar with H.264)
Enabling Game Mode on the Fire Stick did not make any difference. For comparison, the previous generation Fire Stick 4K Max (2021) achieves 4K60 80Mbps HEVC at 4ms average.
Setup:
- 5900X / 3080Ti / 2.5GbE Ethernet
- Sunshine: NVENC preset P1 (lowest latency)
- ASUS AXE6600 (ET8) (Wifi 6E)
One additional thing I noticed is that it seems to hit a network bottleneck above 50Mbps, where my network latency jumps from 2-3ms to 8-12ms. This was tested using Wifi 6E on the AXE6600 router (no other clients have network issues running 4K120 150Mbps). Using the built-in Fire Stick internet speed test, I am seeing a max download speed of 419Mbps. This network issue may be hardware related but was not observed on the previous generation.
Steps to reproduce
- Download Moonlight on Fire Stick 4K Max (2023)
- Connect to a PC
- Enable Moonlight statistics
- Start any game
Affected games
Tested using the Desktop stream and opened Minecraft, Heaven Benchmark, and Time Spy.
Other Moonlight clients
PC
Moonlight adjusted settings
Yes
Moonlight adjusted settings (please complete the following information)
I have tested using H.265 and H.264, and adjusted the bitrate from 20-80Mbps (observed the bottleneck past 50Mbps).
Moonlight default settings
Yes
Gamepad-related connection issue
No
Gamepad-related input issue
No
Gamepad-related streaming issue
No
Android version
Android 11
Device model
Fire Stick 4K Max (2023)
Server PC OS version
Windows 11
Server PC GeForce Experience version
0.21
Server PC Nvidia GPU driver version
546.17
Server PC antivirus and firewall software
Windows Defender
Screenshots
No response
Relevant log output
No response
Additional context
No response
It looks like they made a switch from Android 9 (Fire Stick 2021) to Android 11 (Fire Stick 2023). Could this be related?
Yikes, looks like the Android 11 and higher MediaTek bug again. My hunch is this report is the same issue as:
https://github.com/moonlight-stream/moonlight-android/issues/1241
where they've started using the "modern" C2 decoder, which doesn't work with the magic "vdec-lowlatency" fix that got MediaTek decoders working fast for Moonlight on older Android versions with the same chipset.
Can you check your 2023 fire tv stick 4k max and see what decoder it's using?
My guess is the new Fire TV Stick 4K Max (2023), based on Android 11, will show:
C2.MTK.HEVC.DECODER
instead of the older Android 9 sticks that use the OMX decoder API.
Let us know what you see, thanks!
It's using OMX.MTK.VIDEO.DECODER.HEVC
Oh weird, not what I expected! Could you perhaps sideload this codec info app onto your Fire Stick?
https://github.com/Parseus/codecinfo/releases/download/release_2.5.1/app-standard-tv-release.apk
and use that app to screenshot / dump as much as you can for the codecs that it lists related to HEVC? I know it will be a pain on a TV, but curious what extensions it claims to support.
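(If it's easier than screenshotting the app, here's a rough Java sketch of pulling the same info programmatically with MediaCodecList; the class name and the particular feature flags logged are my own choices, not something Moonlight or the codec info app actually does.)

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

// Rough sketch: log every HEVC decoder on the device along with a few
// capability flags that matter for streaming latency.
public class HevcCodecDump {
    public static void dump() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.ALL_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/hevc")) continue;
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                Log.i("CodecDump", info.getName()
                        + " lowLatency=" + caps.isFeatureSupported(
                                MediaCodecInfo.CodecCapabilities.FEATURE_LowLatency)      // API 30+
                        + " tunneled=" + caps.isFeatureSupported(
                                MediaCodecInfo.CodecCapabilities.FEATURE_TunneledPlayback)
                        + " adaptive=" + caps.isFeatureSupported(
                                MediaCodecInfo.CodecCapabilities.FEATURE_AdaptivePlayback));
            }
        }
    }
}
```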
Also, a logcat output from the ADB debug interface would be helpful; I can compare it to my older 4K Max (2021) and see if the decoder is loading the same way on the new 2023 Max stick.
Enable adb over network and connect from a computer: https://developer.amazon.com/docs/fire-tv/connecting-adb-to-device.html
Run logcat over the adb shell: https://developer.android.com/tools/logcat
Usually just "adb logcat" will suffice after you've run the stream for a few seconds.
Thanks!
Just bought a Fire Stick 4K Max (2023 version) and can confirm decoding latency sits around 15-16 ms no matter which settings I use. I will try to supply some of the information you requested above @peacepenguin tomorrow once I get some time!
Just purchased this model of Fire Stick and I am experiencing the same issue described. I am a software engineer, but I don't have experience with video codecs, so all I can do is follow @peacepenguin's instructions and report my findings as soon as I can.
I switched yesterday from a 4K Max Gen 1 (2021) to a 4K Max Gen 2 (2023) and didn't change anything else in my setup. So PC settings (including Sunshine settings) and network settings stayed exactly the same. Both Fire TV sticks are connected via an ethernet dongle. In Moonlight I chose HEVC as the codec. Now I am experiencing the same issues as @LoserCard.
On my gen 1 stick (2021) decoding latency was around 5 ms. Now on my gen 2 stick (2023) latency is between 15 ms and 20 ms.
As @peacepenguin asked, I am attaching the codec info for HEVC from my 4K Max Gen 2 stick (2023), gathered with the Parseus codec info app.
There are 4 entries under VIDEO in the codec info in the category video/hevc:
OMX.MTK.VIDEO.DECODER.HEVC
Hardware acceleration
true
Software-only
false
Codec provider
Device vendor / OEM
Max supported instances
4
Max resolution
3840x2176
Max bitrate
157 Mbps
Frame rate
1-120fps
Max frame rate per resolution
144p: 120,0 fps
144p (YouTube): 120,0 fps
240p: 120,0 fps
240p (widescreen): 120,0 fps
360p: 120,0 fps
360p (widescreen): 120,0 fps
480p: 120,0 fps
480p (widescreen): 120,0 fps
576p: 120,0 fps
720p: 120,0 fps
1080p: 120,0 fps
4K: 64,5 fps
Color profiles
COLOR_Format16bitRGB565 (0x6)
COLOR_Format32bitABGR8888 (0x7F00A000)
COLOR_FormatYUV420Flexible (0x7F420888)
COLOR_FormatYUV420Planar (0x13)
COLOR_FormatYUV420SemiPlanar (0x15)
OMX_COLOR_FormatVendorMTKYUV (0x7F000001)
Adaptive playback
true (required: false)
Partial frames queuing
false
Secure playback decryption
false
Dynamic timestamp
false
Multiple access units
false
Tunneled playback
true (required: false)
Partial access units per input buffer
false
Profile levels
HEVCProfileMain (0x1): HEVCHighTierLevel51 (0x20000)
HEVCProfileMain10 (0x2): HEVCHighTierLevel51 (0x20000)
HEVCProfileMain10HDR10 (0x1000): HEVCHighTierLevel51 (0x20000)
HEVCProfileMain10HDR10Plus (0x2000): HEVCHighTierLevel51 (0x20000)
OMX.MTK.VIDEO.DECODER.HEVC.secure
Hardware acceleration
true
Software-only
false
Codec provider
Device vendor / OEM
Max supported instances
Max resolution
3840x2176
Max bitrate
160 Mbps
Frame rate
1-120fps
Max frame rate per resolution
144p: 120,0 fps
144p (YouTube): 120,0 fps
240p: 120,0 fps
240p (widescreen): 120,0 fps
360p: 120,0 fps
360p (widescreen): 120,0 fps
480p: 120,0 fps
480p (widescreen): 120,0 fps
576p: 120,0 fps
720p: 120,0 fps
1080p: 120,0 fps
4K: 64,5 fps
Color profiles
COLOR_Format16bitRGB565 (0x6)
COLOR_Format32bitABGR8888 (0x7F00A000)
COLOR_FormatYUV420Flexible (0x7F420888)
COLOR_FormatYUV420Planar (0x13)
COLOR_FormatYUV420SemiPlanar (0x15)
OMX_COLOR_FormatVendorMTKYUV (0x7F000001)
Adaptive playback
true (required: false)
Partial frames queuing
false
Secure playback decryption
true (required: true)
Dynamic timestamp
false
Multiple access units
false
Tunneled playback
true (required: false)
Partial access units per input buffer
false
Profile levels
HEVCProfileMain (0x1): HEVCHighTierLevel51 (0x20000)
HEVCProfileMain10 (0x2): HEVCHighTierLevel51 (0x20000)
HEVCProfileMain10HDR10 (0x1000): HEVCHighTierLevel51 (0x20000)
HEVCProfileMain10HDR10Plus (0x2000): HEVCHighTierLevel51 (0x20000)
c2.android.hevc.decoder
Hardware acceleration
false
Software-only
true
Codec provider
Android platform
Max supported instances
32
Max resolution
4096x4096
Max bitrate
10 Mbps
Frame rate
0-960 fps
Max frame rate per resolution
144p: 960,0 fps
144p (YouTube): 960,0 fps
240p: 960,0 fps
240p (widescreen): 960,0 fps
360p: 740,7 fps
360p (widescreen): 555,6 fps
480p: 416,7 fps
480p (widescreen): 311,5 fps
576p: 308,6 fps
720p: 138,9 fps
1080p: 61,7 fps
4K: 15,4 fps
Color profiles
COLOR_FormatYUV420Flexible (0x7F420888)
COLOR_FormatYUV420PackedPlanar (0x14)
COLOR_FormatYUV420PackedSemiPlanar (0x27)
COLOR_FormatYUV420Planar (0x13)
COLOR_FormatYUV420SemiPlanar (0x15)
Adaptive playback
true (required: false)
Partial frames queuing
false
Secure playback decryption
false
Dynamic timestamp
false
Multiple access units
false
Tunneled playback
false
Partial access units per input buffer
false
Profile levels
HEVCProfileMain (0x1): HEVCHighTierLevel52 (0x80000)
HEVCProfileMainStill (0x4): HEVCHighTierLevel52 (0x80000)
c2.android.hevc.encoder
Hardware acceleration
false
Software-only
true
Codec provider
Android platform
Max supported instances
32
Max resolution
960x544
Max bitrate
10 Mbps
Frame rate
1-120 fps
Max frame rate per resolution
144p: 120,0 fps
144p (YouTube): 120,0 fps
240p: 120,0 fps
240p (widescreen): 120,0 fps
360p: 90,7 fps
360p (widescreen): 68,0 fps
480p: 51,0 fps
480p (widescreen): 38,1 fps
Color profiles
COLOR_FormatSurface (0x7F000789)
COLOR_FormatYUV420Flexible (0x7F420888)
COLOR_FormatYUV420PackedPlanar (0x14)
COLOR_FormatYUV420PackedSemiPlanar (0x27)
COLOR_FormatYUV420Planar (0x13)
COLOR_FormatYUV420SemiPlanar (0x15)
Intra refresh
false
Dynamic timestamp
false
Multiple access units
false
Bitrate modes
Constant bitrate (CBR): true
Constant quality (CQ): true
Variable bitrate (VBR): true
Encoding complexity range
0-10 (default: 0)
Encoding quality range
0-100 (default: 80)
Profile levels
HEVCProfileMain (0x1): HEVCMainTierLevel52 (0x40000)
HEVCProfileMainStill (0x4): HEVCMainTierLevel52 (0x40000)
@peacepenguin now for your question concerning the adb logcat logs:
I started logcat over adb and then connected with Moonlight to my PC running Sunshine. I let the stream run for a couple of seconds and used an Xbox controller to send some inputs. Hope this suffices; if not, please let me know. I am happy to help.
@mrratherford The logcat looks good; there are no errors applying the needed flags to the decoder during Moonlight initialization like we see on some other newer MediaTek devices. I see the correct 'vdec-lowlatency' flag being set, for example.
Can you run Parsec and see if it performs better, worse, or the same? On some other MediaTek devices Parsec works great but Moonlight has high latency; let's see if Parsec has solved the issue, and we can reverse engineer from there if optimization is possible.
@peacepenguin Strangely, in Parsec the decoding time is very low, like 2 to 3ms. Unfortunately, I can't play like I did in Moonlight at 50Mbps bitrate (Parsec doesn't seem to manage the network very well on Fire TV, and the network latency is very high most of the time, unlike Moonlight).
Here are some photos at 15, 20, and 50Mbps bitrate, and it is incredibly playable (except at 15Mbps, where it sometimes gets very grainy in fast movement). I'm willing to carry out more tests in Parsec if necessary.
@SnowJ7Z something is definitely up with networking in Moonlight too. I noted in the issue description that the threshold between 50-60Mbps caused really bad latency in Moonlight. Hopefully both of these issues can be addressed at the same time by reverse engineering Parsec.
@SnowJ7Z can you set Parsec to H.265 (HEVC) and share those results too? So far all we're looking at in depth is H.265. Thanks!
Apparently everything is fine
OK, good to see Parsec is working well; a 3ms decode time is fantastic. It means we should be able to make Moonlight at LEAST as good.
Can someone capture a logcat of Parsec using H.265 on the 2023 Fire TV Stick 4K Max?
The last Parsec log I looked at was a little wonky, like they're bypassing the normal draw-to-screen functions and using their own. They seem to have found a trick to bypass some of the buffering we're seeing on MediaTek decoders in Android lately.
@peacepenguin My version is the Fire TV Stick 4K (2023); basically, the errors that are occurring are the same as on the Max version. I'll send the logcat in an hour, maybe less.
Yes, that makes sense; the only differences I see in the spec sheet for the 2023 4K vs 4K Max are the storage space and the clock speeds of the CPU and GPU. Looks like they have the same chipset, so I imagine the issue will be identical on both. Thanks for collecting the data; it will make finding a fix much easier!
I don't know if I did it right, but here it is. Logcat.txt
Thanks @SnowJ7Z and @mrratherford, the logcats are really helpful.
Try this APK I just compiled; I made a quick change to the decoder config to align it more closely with Parsec.
You'll need to uninstall your existing moonlight app before installing this modified one:
https://github.com/peacepenguin/moonlight-android/releases
Changes: https://github.com/peacepenguin/moonlight-android/commit/7ff2561fd3632313f974f7d5474bb256402d084d
@peacepenguin Just tried this on my MediaTek tablet and it is still using the C2 decoder, not the OMX one. The SoC is a Helio G99.
@peacepenguin Still unsuccessful, still within a range of 13 to 16ms
@peacepenguin I'm seeing the same results as @SnowJ7Z .
Moonlight (https://github.com/peacepenguin/moonlight-android/commit/7ff2561fd3632313f974f7d5474bb256402d084d) runs with 10-16 ms decoding latency. The only difference from version 0.21 is that the screen is black for a couple of seconds after connecting to the Sunshine server, but after that the stream runs stably.
Here are the logs for Moonlight (https://github.com/peacepenguin/moonlight-android/commit/7ff2561fd3632313f974f7d5474bb256402d084d): logcat_moonlight_7ff2561_firetv_4k_max_gen2_2023.txt
I installed Parsec (v150-85c) on the Fire TV 4K Max (2023) and looked at its decoding latency. Parsec seems to have some tricks up its sleeve, because its decoding latency was around 3 ms:
I'm attaching the logs for Parsec, too: logcat_parsec_150-85c_firetv_4k_max_gen2_2023.txt
(Looking at the logs now, I'm not sure if I captured the moment of connecting to the host with Parsec. If not, please let me know and I will try again.)
@SnowJ7Z thanks for testing. The good news is adaptive playback isn't causing trouble; that's a decoder feature we can re-enable for these newer sticks that was previously causing problems.
@mrratherford thanks for testing and sending more logs, it's very helpful.
I'm now looking at the settings that differ and will make further changes to my fork to hopefully get Moonlight decoding below 5ms on these sticks.
I'll post test builds as I progress.
parsec:
MediaCodecLogger: HW.omx.video.hevc Adaptive Streaming max w:3840 h:2160, screen size:FHD
VPUD : MaxFixedBuf Info, w*h=0*0, mode=0
VPUD : [Vdec_initFixedMaxMode ,4789] bFixedMaxBuffer:0 size(0 0), bMetaEnhance=0
VPUD : [Info] bMetaEnhance=0, bFixedMaxBuffer=0
## consistent 'no free buffer' errors 10 per sec logged during streaming, indicating buffer is never available
## which based on the great performance, assume this is a good error to have
I MtkOmxVdecExV4L2: [0xee4bf140] FBD, mNumPendingOutput(13), input_index = 3, output_index = 0 (0xee69b180)(2, 1)
E MtkOmxVdecExV4L2: [0xee4bf140] PrepareAvaliableColorConvertBuffer, no free buffer can't queue to CCDst
vs moonlight with adaptive playback enabled (https://github.com/peacepenguin/moonlight-android/commit/7ff2561fd3632313f974f7d5474bb256402d084d):
com.limelight.LimeLog: Configuring with format: {max-height=1080, color-transfer=3, vdec-lowlatency=1, max-width=1920, low-latency=1, mime=video/hevc, width=1920, color-range=2, priority=0, frame-rate=60, color-standard=1, height=1080}
MediaCodecLogger: HW.omx.video.hevc Adaptive Streaming max w:1920 h:1080, screen size:4K
VPUD : MaxFixedBuf Info, w*h=3840*2176, mode=3
VPUD : [Vdec_initFixedMaxMode ,4789] bFixedMaxBuffer:0 size(3840 2176), bMetaEnhance=1
VPUD : [Info] bMetaEnhance=1, bFixedMaxBuffer=0
VPUD : [Info] bMetaEnhance, PicAlloc(1920,1088), PIC(1920,1088) u4DpbSize=3
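For context, here's a hedged Java sketch of roughly what that Moonlight decoder configuration corresponds to in MediaCodec terms. The class name, resolution values, and helper structure are illustrative only; the keys mirror the log line above ("vdec-lowlatency" is the MediaTek vendor key, KEY_LOW_LATENCY is the standard Android 11+ key, and max-width/max-height are what opt the decoder into adaptive playback):

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

// Sketch only: builds a decoder format matching the keys seen in the
// Moonlight "Configuring with format" log line above.
public class MtkDecoderConfigSketch {
    public static MediaCodec configure(Surface outputSurface) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_HEVC, 1920, 1080);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 60);
        format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);   // standard "low-latency" key (API 30+)
        format.setInteger("vdec-lowlatency", 1);             // MediaTek vendor key from the log
        format.setInteger(MediaFormat.KEY_MAX_WIDTH, 1920);  // max-width/max-height enable adaptive playback
        format.setInteger(MediaFormat.KEY_MAX_HEIGHT, 1080);

        MediaCodec decoder = MediaCodec.createByCodecName("OMX.MTK.VIDEO.DECODER.HEVC");
        decoder.configure(format, outputSurface, null, 0);
        decoder.start();
        return decoder;
    }
}
```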
After digging through lots of code and logs, I think what we need to do next is implement "Tunnel Mode" (or Multimedia Tunneling) for audio/video playback in moonlight-android.
The codec info posted for the fire tv 4k max 2023 shows "Tunneled Playback" as a supported feature of the hardware hevc decoder.
The notes on Tunneled mode are very promising in both Google's Android and Amazon's Fire OS guidance for low-latency video playback.
I'm looking into how to convert my fork of moonlight-android over to use Tunneled Playback mode. It looks easy based on the docs linked below, but I'm not sure where to initialize the audio session ID that is required to sync the video to the audio. Both the audio and video streams need to reference the same pre-created audio session ID when they are instantiated. Currently in moonlight-android the audio decoder gets created after the video decoder; I think we need to switch that around and set up the audio decoder first, with the audio session ID, then pass that same ID to the video decoder too (there's a rough sketch of that ordering after the docs quoted below).
Then I just need to figure out how to remove the traditional dequeueOutputBuffer and releaseOutputBuffer rendering steps, as noted in Amazon's guidance.
https://developer.amazon.com/docs/fire-tv/4k-tunnel-mode-playback.html
Tunnel Mode Playback
The hardware decoder for some Fire TV devices support playback of 4K @ 60 FPS. To play a video at such high resolution and frame rate, the timing requirement of media pipeline is very aggressive and the app may not be able to render 4K frames at 16 msec interval due to thread and process scheduling limitations of the kernel. This may cause frame drops and a sub-par movie experience. To get the best out of the hardware, use Tunnel Mode playback.
Do not call dequeueOutputBuffer and releaseOutputBuffer for video decoder.
https://source.android.com/docs/devices/tv/multimedia-tunneling
tunneled video playback bypasses the app code and reduces the number of processes acting on the video, it can provide more efficient video rendering
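To make the ordering concrete, here's a minimal sketch of that flow using the standard Android APIs (AudioManager.generateAudioSessionId, FLAG_HW_AV_SYNC, FEATURE_TunneledPlayback, KEY_AUDIO_SESSION_ID). The class and method names are mine and don't correspond to moonlight-android's real structure; it's only meant to show which pieces have to share the session ID:

```java
import android.media.AudioAttributes;
import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;

// Sketch of the tunneled-playback ordering: create the audio session ID first,
// build the HW_AV_SYNC AudioTrack with it, then hand the same ID to the video
// decoder's MediaFormat along with the tunneled-playback feature request.
public class TunnelModeSketch {
    public static int setUpTunneledAudio(AudioManager audioManager, MediaFormat videoFormat) {
        int audioSessionId = audioManager.generateAudioSessionId();

        AudioTrack track = new AudioTrack.Builder()
                .setAudioAttributes(new AudioAttributes.Builder()
                        .setUsage(AudioAttributes.USAGE_MEDIA)
                        .setContentType(AudioAttributes.CONTENT_TYPE_MOVIE)
                        .setFlags(AudioAttributes.FLAG_HW_AV_SYNC) // required for tunneling
                        .build())
                .setAudioFormat(new AudioFormat.Builder()
                        .setEncoding(AudioFormat.ENCODING_PCM_16BIT)
                        .setSampleRate(48000)
                        .setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
                        .build())
                .setBufferSizeInBytes(AudioTrack.getMinBufferSize(48000,
                        AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT))
                .setTransferMode(AudioTrack.MODE_STREAM)
                .setSessionId(audioSessionId)
                .build();
        track.play();

        // The video decoder's format must reference the same session ID and
        // request the tunneled-playback feature before configure() is called.
        videoFormat.setFeatureEnabled(
                MediaCodecInfo.CodecCapabilities.FEATURE_TunneledPlayback, true);
        videoFormat.setInteger(MediaFormat.KEY_AUDIO_SESSION_ID, audioSessionId);
        return audioSessionId;
    }
}
```

With tunneling enabled, frames queued via queueInputBuffer should be presented against the audio timestamps by the platform, which seems to be why Amazon's note above says to stop calling dequeueOutputBuffer and releaseOutputBuffer.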
@SnowJ7Z not sure but to test audio sync and latency issues I use this video on a loop:
https://www.youtube.com/watch?v=TjAa0wOe5k4
First, just make sure it's perfect when you sit in front of your computer, with direct-attached output to monitor and speakers, to ensure the base rendering is working as intended (no Sunshine/Moonlight involved here; just gotta ensure even local playback is perfect first).
Then open up moonlight, desktop session, and see if the audio and video is still in sync or not when viewing the same video over moonlight.
So far in my experience Moonlight has been perfect for AV synchronization, so it sounds like the issue you are having comes from the game itself not rendering the audio and video in sync. But you can narrow that down with the above test and see where the issue is coming from.
Glad to see folks are already digging into this, just bought a Max mostly because I was excited about using moonlight with it. Let me know if there's anything I can do to help as far as logs or whatever. I can also potentially help with dev since I know Java really well, but my experience with these streaming protocols is pretty limited.
@BuiltInParris
I'm digging into this deeply for academic reasons, and I just snagged a new Max 2023 to test with.
I could use your help, as Java is not my main language, to say the least!
I'm trying to find where the AudioTrack object "track" gets created, and move it to before the video decoder gets initialized. The goal is to implement Tunnel Mode decoding, which is documented in very few places on the web, with even fewer examples.
I need to get the audio session ID from the "track" object and pass it to the video decoder. Currently the video decoder is set up before the audio decoder, so I can't get the audio session ID where I need it.
Let me know if you can look at that, I'll post my progress and test builds in this thread so we can stay in touch.
I should probably hop on the Discord and see if any of the maintainers can weigh in on moving to Tunnel Mode decoding in general.
@peacepenguin Looking really quickly (busy weekend for me), it looks like it's done in the AndroidAudioRenderer. The method createAudioTrack is responsible for creating the AudioTrack object, which happens in the setup method and that's called by bridgeArInit.
Honestly, I'm a bit confused by how exactly the method "Java_com_limelight_nvstream_jni_MoonBridge_init" works, but it seems like that's where they make all these static calls to initialize each component. "bridgeArInit" makes the inbound call to create the AudioTrack. It seems that "bridgeDrSetup" is responsible for creating the video decoder. You should just be able to swap those two to get the effect you want. You can find the calls in callbacks.c.
thanks @BuiltInParris
I've now got the audio session ID generated and passed down to the audio and video rendering startup as params, which seems OK. As soon as I start the session things hang and crash though, which makes sense, as I've not removed the traditional rendering methods for the video, which Amazon explicitly says need to be removed in tunnel playback mode:
Do not call dequeueOutputBuffer and releaseOutputBuffer for video decoder.
https://developer.amazon.com/docs/fire-tv/4k-tunnel-mode-playback.html
I'm pretty much stuck here; I have no idea what to replace those calls with. dequeueOutputBuffer and releaseOutputBuffer are how the screen gets drawn, and I don't see any further documentation on how you're supposed to draw the screen without them.
A non-working WIP can be found here. At least it has the audio session ID being passed into both audio and video now: https://github.com/moonlight-stream/moonlight-android/compare/master...peacepenguin:moonlight-android:tunnelmode
That diff is from my 'tunnelmode' branch in my fork of moonlight-android here: https://github.com/peacepenguin/moonlight-android/tree/tunnelmode
Please chime in if anyone knows what to look at next. This is all to enable "tunnel mode playback", of which Amazon says: "To get the best out of the hardware, use Tunnel Mode playback."
Google says it reduces latency: https://source.android.com/docs/devices/tv/multimedia-tunneling
And these sticks support it per their codec output. So should shave some decode latency off the top by switching to tunnel mode.
So I think we're on a good path here; I'm just quickly reaching my limit on Java programming, especially with basically no implementation examples from Google or Amazon on how to use it.
Anyone with Java experience please take a look at the links above and my wip code and see if you can help.
Thanks all!
I don't know much about Java or video streaming in general, but just in case you haven't seen this (although I'm sure you probably have):
https://source.android.com/docs/devices/tv/multimedia-tunneling#:~:text=Multimedia%20tunneling%20enables%20compressed%20video,code%20or%20Android%20framework%20code.
There is some information on the android documentation which might be helpful.
The other source of example usage is the ExoPlayer source, which again you have probably seen.
Thanks again for your help and efforts, it feels like a solution is close!
Hey @peacepenguin! I just picked up a Sony X85K TV and am seeing the same issue (also based on Android Q), but I was able to get multimedia tunneling working on it using your branch. Here are the changes I made:
- dequeueOutputBuffer and releaseOutputBuffer (the functions which time the display of the frame to the activity) are no longer needed in media tunneling because the timing of the frame is now synced to the timestamp in the audio track (more info here: https://source.android.com/docs/devices/tv/multimedia-tunneling). I was able to stop Moonlight from calling these functions by commenting out these lines: https://github.com/prototypicalpro/moonlight-android/blob/63fac0c3c1aa51791048ec503f06516fb7e2d1f3/app/src/main/java/com/limelight/binding/video/MediaCodecDecoderRenderer.java#L1231-L1236 Video will still be sent to the hardware codec through queueInputBuffer. queueInputBuffer is called regularly by the Moonlight C code, so no changes were needed there.
- Since the video will sync to the audio, the audio needs some kind of timestamp the video can reference. This timestamp is provided in the fourth parameter of AudioTrack.write:

  track.write(byteBuffer, byteBuffer.remaining(), AudioTrack.WRITE_NON_BLOCKING, System.nanoTime() - MoonAudioStartTS);

  If this parameter isn't provided, the video player will be stuck on a black screen. (There is probably a more precise way to compute the current timestamp than what I'm doing here.)
- Finally, it seems the FLAG_HW_AV_SYNC flag on AudioTrack adds a whole new layer of hardware support. Specifically, on my Sony TV stereo audio coded as PCM 16-bit supports AV sync, but 5.1 audio as PCM 16 does not. To use AV sync with more than two channels, only proprietary surround formats are supported:

  // tryMakeSyncAudioTrack(encoding, sampleRate, channelMask) tests FLAG_HW_AV_SYNC and AudioTrack
  tryMakeSyncAudioTrack(AudioFormat.ENCODING_PCM_16BIT, 48000, AudioFormat.CHANNEL_OUT_STEREO);  // ok
  tryMakeSyncAudioTrack(AudioFormat.ENCODING_PCM_16BIT, 48000, AudioFormat.CHANNEL_OUT_5POINT1); // error
  tryMakeSyncAudioTrack(AudioFormat.ENCODING_AC3, 48000, AudioFormat.CHANNEL_OUT_5POINT1);       // ok
  tryMakeSyncAudioTrack(AudioFormat.ENCODING_AC4, 48000, AudioFormat.CHANNEL_OUT_5POINT1);       // ok
This means that if Moonlight is set to 5.1 audio with media tunneling enabled, the stream will crash. Parsec only supports stereo audio, so they probably haven't encountered this issue.
All my changes to your branch are here: https://github.com/peacepenguin/moonlight-android/compare/tunnelmode...prototypicalpro:moonlight-android:tunnelmode?expand=1 . I haven't run any latency tests since my audio sync code is janky and I'm sure it will cause issues; I'd be interested to know if Moonlight has a better way to compute the audio timestamp.
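If it helps with the timestamp question: one option (just a sketch under my own assumptions, not anything from Moonlight's or Parsec's code) is to derive the presentation timestamp from the count of PCM frames written so far instead of System.nanoTime(), so it stays monotonic with the audio clock. The sample rate and bytes-per-frame values below assume a 48 kHz stereo 16-bit stream:

```java
import android.media.AudioTrack;
import java.nio.ByteBuffer;

// Sketch: keep a running count of PCM frames written and convert it to a
// nanosecond PTS for the fourth parameter of AudioTrack.write().
public class AudioPtsSketch {
    private static final int SAMPLE_RATE = 48000;
    private static final int BYTES_PER_FRAME = 2 /* channels */ * 2 /* bytes per sample */;

    private long framesWritten = 0;

    public void writeWithPts(AudioTrack track, ByteBuffer pcm) {
        // PTS of the first frame in this buffer, in nanoseconds since stream start
        long ptsNanos = framesWritten * 1_000_000_000L / SAMPLE_RATE;
        int frames = pcm.remaining() / BYTES_PER_FRAME;
        track.write(pcm, pcm.remaining(), AudioTrack.WRITE_NON_BLOCKING, ptsNanos);
        framesWritten += frames;
    }
}
```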