
Decoding raw h264 byte[] array packets sent over UDP?

Open ZeoWorks opened this issue 4 years ago • 6 comments

Hi sir, I'm hoping to decode raw H.264 packets from a byte[] array. Would this be possible with your project? Furthermore, I'm on Windows. Thanks!

ZeoWorks avatar Nov 03 '21 02:11 ZeoWorks

Hi @ZeoWorks,

UNHVD (this project) does that but:

  • it carries H.264 packets over a custom UDP protocol
  • each H.264 packet is carried in multiple UDP packets (which are limited in size)

So the simple answer to your question is no: it will not work directly with a raw H.264 byte array.

bmegli avatar Nov 03 '21 19:11 bmegli

UNHVD is built on top of NHVD (Network Hardware Video Decoder).

NHVD is built on top of:

  • MLSP (the protocol)
  • HVD (Hardware Video Decoder)

bmegli avatar Nov 03 '21 19:11 bmegli

HVD has that functionality (decoding from raw h264 packet data) but:

  • it is a C project
  • hardware acceleration on Windows is implemented but was never tested
    • it is used on Linux
  • it expects logical H.264 packets (representing encoded frame)
    • so it will not parse H.264 stream for you
  • the build system was used only on Linux
    • may require changes to build on Windows
  • to use it from Unity you would have to:
    • wrap it from C# (e.g. P/Invoke, marshalling, NativeArrays, IntPtr and similar techniques)

It may not be worth the effort:

  • if you already have the H.264 byte[] array on the Unity side
    • the purpose of UNHVD is to never use managed memory
    • and this would make you round-trip between managed (your raw H.264) and unmanaged memory

bmegli avatar Nov 03 '21 19:11 bmegli

Summing up:

  • it may be easier for you to look for another solution
  • the functionality of decoding raw H.264 packets is in HVD
  • but using it from Unity is not straightforward

bmegli avatar Nov 03 '21 19:11 bmegli

Hello, bmegli.

I am working with your code to combine depth images and VR in Unity. I checked your code and saw that the encoder and decoder run on Linux, but my Unity is running on Windows 10. (I have already set up the LattePanda Alpha with your Linux encoder code.)

I wonder, if I use WSL (Linux in a virtual machine on Windows 10): Can I send the decoded data on WSL?

I thought I could just change the IP and port in your Unity scripts to the WSL IP and port (local Linux on Windows 10). Am I wrong? I would appreciate it if you could give me some advice.

Thank you, have a good time!

kds60513 avatar Sep 02 '22 03:09 kds60513

Hi @kds60513

First - the question is only loosely related to the issue. Putting that aside...


There are two aspects here: hardware decoding and rendering (the whole GUI layer).

I am assuming your question is "can I run hardware decoding on WSL?"

> Can I send the decoded data on WSL?

I am assuming what you really ask is "Can I receive and hardware decode data on WSL?" (otherwise it doesn't make sense)


On Windows 11 with WSLg:

  • for H.264 it might be possible with this Mesa merge request (already merged)
  • for H.265 it might be possible with this Mesa merge request (not merged yet)
    • but I am not sure it has 10-bit support (e.g. HEVC Main 10)

On Windows 10, honestly, I don't know. These would be the related issues:

  • https://github.com/microsoft/WSL/issues/4700
  • https://github.com/microsoft/WSL/issues/6357

If I were you, I would not follow this route.

bmegli avatar Sep 06 '22 14:09 bmegli