jdibenes
Yes, an additional transformation may also be needed, since Unreal Engine is +z = up, +y = right, and +x = forward, while HoloLens is +y = up, +x...
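For illustration, here is a minimal sketch of such an axis remap. It assumes the HoloLens points are in the right-handed Windows Perception convention (+x right, +y up, -z forward, in meters) and the target is Unreal's default convention (+x forward, +y right, +z up, in centimeters); the function name and the exact conventions are assumptions, so verify them against your actual data before using.

```cpp
// Sketch: remap a HoloLens position into Unreal Engine axes.
// Assumed source convention: right-handed, +x right, +y up, -z forward, meters.
// Assumed target convention: left-handed, +x forward, +y right, +z up, centimeters.
struct float3 { float x, y, z; };

static float3 hololens_to_unreal(float3 p)
{
    float const meters_to_cm = 100.0f; // Unreal defaults to centimeters
    float3 q;
    q.x = -p.z * meters_to_cm; // forward: HoloLens -z -> Unreal +x
    q.y =  p.x * meters_to_cm; // right:   HoloLens +x -> Unreal +y
    q.z =  p.y * meters_to_cm; // up:      HoloLens +y -> Unreal +z
    return q;
}
```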
Hello, I think option 2 could mess up the encoder, but option 1 could work. If option 1 does not work by itself, then maybe a variable framerate approach using...
All server sockets are blocking, and sending is non-overlapped. All stream data is sent through `send_multiple`. `WSASend` should buffer according to this document: https://learn.microsoft.com/en-us/previous-versions/troubleshoot/windows/win32/data-segment-tcp-winsock > To optimize performance at the application...
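As a rough sketch of what a `send_multiple`-style helper can look like (this is not the actual hl2ss code, just an illustration of the pattern): several buffers are gathered into a single blocking, non-overlapped `WSASend` call so WinSock can coalesce them into fewer TCP segments.

```cpp
// Sketch of a gather-send over a blocking socket (non-overlapped WSASend).
#include <winsock2.h>
#include <vector>

#pragma comment(lib, "ws2_32.lib")

// Returns true if the data was accepted by WinSock, false on socket error.
bool send_multiple(SOCKET s, std::vector<WSABUF>& buffers)
{
    DWORD bytes_sent = 0;
    // Null OVERLAPPED and null completion routine: the call blocks until the
    // data has been copied into the socket send buffer (or the send fails).
    int status = WSASend(s, buffers.data(), static_cast<DWORD>(buffers.size()),
                         &bytes_sent, 0, NULL, NULL);
    return status != SOCKET_ERROR;
}
```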
That's awesome. Thanks for sharing your solution.
I had the PV camera look at a stopwatch on the PC monitor, then put the PV video window next to the stopwatch, took a screenshot, and compared the time...
Just to confirm, are you replacing the timestamp in `PV_OnVideoFrameArrived` or in `PV_SendSample`? I ask because `PV_SendSample` runs after the encoder.
Maybe the difference is the photons-to-`PV_OnVideoFrameArrived` delay. You might be able to estimate it by comparing the frame timestamp against the QPC time sampled when `PV_OnVideoFrameArrived` starts.
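Something along these lines could work as an estimate. It assumes the frame timestamp is already expressed in the same QPC-based timebase in 100 ns units (as `MediaFrameReference.SystemRelativeTime` is); the function names here are just illustrative.

```cpp
// Sketch: estimate the frame-arrival delay by comparing the frame timestamp
// (assumed QPC-based, 100 ns units) to the QPC time at the start of the handler.
#include <windows.h>
#include <cstdint>

// Current QPC time converted to 100 ns ticks.
static int64_t qpc_now_hns()
{
    LARGE_INTEGER counter, frequency;
    QueryPerformanceCounter(&counter);
    QueryPerformanceFrequency(&frequency);
    int64_t const c = counter.QuadPart;
    int64_t const f = frequency.QuadPart;
    // Split the conversion to avoid overflow for large tick counts.
    return (c / f) * 10'000'000 + ((c % f) * 10'000'000) / f;
}

// Call at the top of the frame-arrived handler with the frame's timestamp.
static int64_t estimate_arrival_delay_hns(int64_t frame_timestamp_hns)
{
    return qpc_now_hns() - frame_timestamp_hns;
}
```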
Hi, `CustomMediaSink` and `CustomStreamSink` are just barebones implementations of the `IMFMediaSink` and `IMFStreamSink` interfaces; their purpose is to intercept encoded frames (`IMFSample`) and pass them to a callback function...
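Conceptually, the interception point is `IMFStreamSink::ProcessSample`: the pipeline hands the sink each encoded `IMFSample`, and instead of writing it anywhere the sink forwards it to a callback. The sketch below only shows that one method with an assumed `std::function` callback; the rest of the interface plumbing (IUnknown, event queue, media type negotiation) is omitted, and the names are illustrative rather than the actual hl2ss code.

```cpp
// Sketch: a stream sink that forwards every encoded sample to a callback.
#include <mfidl.h>
#include <functional>

using sample_callback = std::function<void(IMFSample*)>;

class StreamSinkSketch /* : public IMFStreamSink, ... */
{
public:
    explicit StreamSinkSketch(sample_callback callback) : m_callback(std::move(callback)) {}

    // Called by the Media Foundation pipeline for every encoded sample.
    HRESULT ProcessSample(IMFSample* pSample)
    {
        if (m_callback && pSample) { m_callback(pSample); }
        return S_OK;
    }

private:
    sample_callback m_callback;
};
```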
Hi, if you have access to the Unity project you can try adding the hl2ss plugin to it. See hl2ss_unity for an example.
The plugin depends on WinRT code generated when building the app, so it may be complicated (the VS solution referenced by the guide builds the app and then the plugin)...