ALVR
[Custom client] Implementation of my own client for iOS
I'm trying to implement my own client for iOS, but without good documentation it's hard to understand how it all should work.
I just have some questions:
- What angle values are considered normal for alvr_send_views_config?
- Why am I receiving such a strange picture? The first thing that catches the eye is the strange intersection of planes in the center (partially fixed by reducing the size of half the screen). The second, more critical thing is that the eye cameras are located in very strange places.
My views config code (based on the alvr-visionpro repo):
let v: Float = 1.1
let v2: Float = 1.1
let leftAngles = atan(simd_float4(v, v, v, v))
let rightAngles = atan(simd_float4(v2, v2, v2, v2))
let leftFov = AlvrFov(left: -leftAngles.x, right: leftAngles.y, up: leftAngles.z, down: -leftAngles.w)
let rightFov = AlvrFov(left: -rightAngles.x, right: rightAngles.y, up: rightAngles.z, down: -rightAngles.w)
let fovs = [leftFov, rightFov]
let ipd = Float(0.063) // Magic value from alvr-visionpro repo
alvr_send_views_config(fovs, ipd)
What am I doing wrong?
For iPhones?
@ShootingKing-AM Yes
Maybe you can take a look at the PhoneVR codebase? It uses alvr_client_core + the Cardboard API.
I also wanted to extend PhoneVR support to iPhones. Maybe we can use some platform-agnostic library (Flutter or something?) to serve VR on both Android and iPhones?
relates to: https://github.com/alvr-org/PhoneVR/issues/100
@ShootingKing-AM Yes, I already use the PhoneVR codebase as an example implementation, but I have some specific problems with the Cardboard API (for now). In general I just want to understand how it all should work.
About integrating iOS support into PhoneVR... We could use Kotlin Multiplatform, but I don't think that would be quick to implement. :c Maybe we could implement the app abstraction in C/C++?
What angle values are considered normal for alvr_send_views_config?
alvr_send_views_config takes the field of view in radians: for example, in the Android client,
https://github.com/alvr-org/ALVR/blob/073d16e2a5fa02e2970a37982dde5db9709006c5/alvr/client_openxr/src/lib.rs#L122
There are sample field of view values for other headsets online, e.g. Quest 2 - see the Left eye head FOV section. (note that that page has the angles in degrees instead of radians though)
For Cardboard, you get the field of view values via CardboardLensDistortion_getFieldOfView https://developers.google.com/cardboard/reference/c/group/lens-distortion#cardboardlensdistortion_getfieldofview
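For example, a rough sketch of wiring that into AlvrFov might look like this (assumptions: the SDK's documented output order of left, right, bottom, top half angles in radians, the CardboardEye enum importing into Swift as kLeft/kRight, and a lensDistortion pointer you created earlier; double-check against lens_distortion.h):
// Sketch: query the per-eye FoV from Cardboard and convert it to ALVR's signed
// convention (negative left/down, positive right/up, in radians).
// Assumes the output order is left, right, bottom, top — verify against the header.
func alvrFov(for eye: CardboardEye, lensDistortion: OpaquePointer) -> AlvrFov {
    var fov = [Float](repeating: 0, count: 4)
    CardboardLensDistortion_getFieldOfView(lensDistortion, eye, &fov)
    return AlvrFov(left: -fov[0], right: fov[1], up: fov[3], down: -fov[2])
}
let fovs = [alvrFov(for: kLeft, lensDistortion: lensDistortion),
            alvrFov(for: kRight, lensDistortion: lensDistortion)]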
let ipd = Float(0.063) // Magic value from alvr-visionpro repo
This is the interpupillary distance (the distance between the eyes); 63 mm is used as the default there, so I chose it here as well. I believe Cardboard doesn't offer the IPD directly, but encodes it in the head-to-eye translation matrices (https://github.com/googlevr/cardboard/blob/c8842698f4a9d63cce865e7d6cb75773a4673496/sdk/lens_distortion.cc#L40, CardboardLensDistortion_getEyeFromHeadMatrix).
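If you'd rather not hardcode 0.063, something along these lines should pull the IPD out of those matrices (a sketch only; it assumes the SDK fills a column-major float[16] with the translation in elements 12..14, so verify that against the headers before trusting it):
// Sketch: estimate the IPD from the per-eye eye-from-head matrices instead of
// hardcoding 63 mm. Assumes a column-major 4x4 with the translation at 12..14.
func estimatedIpd(lensDistortion: OpaquePointer) -> Float {
    var left = [Float](repeating: 0, count: 16)
    var right = [Float](repeating: 0, count: 16)
    CardboardLensDistortion_getEyeFromHeadMatrix(lensDistortion, kLeft, &left)
    CardboardLensDistortion_getEyeFromHeadMatrix(lensDistortion, kRight, &right)
    // The eyes are offset symmetrically along x, so the IPD is the distance
    // between the two x translations.
    return abs(left[12] - right[12])
}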
I think your current alvr_send_views_config values are reasonable: atan(1.1) is about 47.7 degrees, so that corresponds to roughly a 95-degree FoV (-48° left, +48° right).
If I remember correctly, you get the weird picture if you don't send tracking: are you sure you're sending tracking correctly? Looking at your repo, you're only sending tracking on ALVR_EVENT_FRAME_READY; in alvr-visionos that was only there for testing.
You should be sending tracking continuously:
- The Android client sends it at 3x the framerate: https://github.com/alvr-org/ALVR/blob/073d16e2a5fa02e2970a37982dde5db9709006c5/alvr/client_openxr/src/lib.rs#L559
- The mock client does the same: https://github.com/alvr-org/ALVR/blob/073d16e2a5fa02e2970a37982dde5db9709006c5/alvr/client_mock/src/main.rs#L180
- The visionOS client sends it per frame, I think? https://github.com/alvr-org/alvr-visionos/blob/9d99df0c0848b816bc43dd5ebc49151367389935/ALVRClient/Renderer.swift#L450
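So the structure should look roughly like the sketch below, rather than piggybacking on the frame-ready handler (sendTracking() here is just a placeholder for your existing alvr_send_tracking call, and the 90 Hz refresh rate is an assumed value):
import Foundation

// Placeholder for what you currently do on ALVR_EVENT_FRAME_READY:
// build the head pose + target timestamp and pass them to alvr_send_tracking.
func sendTracking() {
    // ... your existing alvr_send_tracking call ...
}

// Sketch: a dedicated thread that sends tracking at ~3x the display refresh
// rate (90 Hz assumed here), independent of when frames arrive.
let trackingThread = Thread {
    let interval = 1.0 / (90.0 * 3.0)
    while true {
        sendTracking()
        Thread.sleep(forTimeInterval: interval)
    }
}
trackingThread.name = "TrackingThread"
trackingThread.start()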
@ShootingKing-AM Yes, I already use the PhoneVR codebase as an example implementation, but I have some specific problems with the Cardboard API (for now). In general I just want to understand how it all should work.
About integrating iOS support into PhoneVR... We could use Kotlin Multiplatform, but I don't think that would be quick to implement. :c Maybe we could implement the app abstraction in C/C++?
I forgot to reply to this. Yeah, sure. My intent was to reuse as much code as possible from PhoneVR and merge iOS (iPhone) support into PVR (since there was a request for iOS support there too).
@zhuowei Thanks for your answer, very comprehensive information. And about the "weird picture": it's just the frames adjusted for foveated encoding; I just disabled it (you can do that in the newest versions of alvr-server). And in my repo I have a foveation-feature branch (but it's still broken for some reason) to implement this feature.
@ShootingKing-AM And about integrating with your project... As I see it, integrating iOS support will need a lot of work.
This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
I have plans to create an iOS client for debugging purposes. I will start working on it as my next task.