low tracking quality
Hello,
I'm writing a basic ROS 2 wrapper around libsurvive (all it does is scrape the poses of the headset/controllers and broadcast them as transforms; see the minimal sketch at the end of this post). I'm getting very inconsistent tracking quality and am constantly hit with messages like:
Warning: Disambiguator got lost at 2819285738; refinding state for WM0
Info: Locked onto state 5( 6, 634218) at 2821685058 for WM0
or
Warning: Too many failures for WM1 at 260.804865; reseting calibration 0.000000e+00 ( 6.5233 stationary)
or libsurvive crashing because the solver fails to converge
These issues are accompanied by the poses of the controllers/headset in my scene moving very erratically (zooming off into space in RViz, etc.). I've also noticed that once this happens I need to delete the calibration data to get the controllers to behave again.
Is this a sign that my room setup is subpar, or should I be using a different solver (follow-up question: how do I select the solver when I launch libsurvive)? Do you have any advice on how to tell whether the lighting or reflective objects in my space could be contributing to my issues? I'm fairly sure it's not how I'm writing my wrapper, because I get the same issues with Andrew Symington's ROS 2 wrapper as well.
https://github.com/user-attachments/assets/d627c4a8-f853-4339-9bab-1821c0760caf
Some things I've tried:
- moving to another room (no improvement)
- running libsurvive outside of my project's Docker container. Doing this made the controller/headset poses MUCH more stable.
Is there something about running libsurvive in a Docker container that could cause this?
Any advice on bug squashing would be really appreciated. I've attached a video of what the behavior looks like in RViz when I run libsurvive inside my project's Docker container and move a controller.
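For reference, my wrapper boils down to something like the sketch below: poll libsurvive's simple API and rebroadcast each pose update through a tf2 broadcaster. The node and frame names here are placeholders, not my actual code:

```cpp
// Minimal sketch (not my exact code): poll libsurvive's simple API and
// rebroadcast every pose update as a TF transform. "survive_tf_node" and
// "libsurvive_world" are placeholder names.
#include <memory>

#include <geometry_msgs/msg/transform_stamped.hpp>
#include <rclcpp/rclcpp.hpp>
#include <survive_api.h>
#include <tf2_ros/transform_broadcaster.h>

int main(int argc, char **argv) {
  rclcpp::init(argc, argv);
  auto node = std::make_shared<rclcpp::Node>("survive_tf_node");
  tf2_ros::TransformBroadcaster broadcaster(node);

  SurviveSimpleContext *actx = survive_simple_init(argc, argv);
  if (!actx)
    return 1;
  survive_simple_start_thread(actx);

  SurviveSimpleEvent event = {};
  while (rclcpp::ok() && survive_simple_wait_for_event(actx, &event) !=
                             SurviveSimpleEventType_Shutdown) {
    if (event.event_type != SurviveSimpleEventType_PoseUpdateEvent)
      continue;
    const auto *pose_event = survive_simple_get_pose_updated_event(&event);
    const SurvivePose &p = pose_event->pose;

    geometry_msgs::msg::TransformStamped tf;
    tf.header.stamp = node->get_clock()->now();
    tf.header.frame_id = "libsurvive_world";
    tf.child_frame_id = survive_simple_object_name(pose_event->object);
    tf.transform.translation.x = p.Pos[0];
    tf.transform.translation.y = p.Pos[1];
    tf.transform.translation.z = p.Pos[2];
    tf.transform.rotation.w = p.Rot[0]; // libsurvive quats are [w, x, y, z]
    tf.transform.rotation.x = p.Rot[1];
    tf.transform.rotation.y = p.Rot[2];
    tf.transform.rotation.z = p.Rot[3];
    broadcaster.sendTransform(tf);
  }

  survive_simple_close(actx);
  rclcpp::shutdown();
  return 0;
}
```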
Just making sure you have seen it: there is already some ROS code in https://github.com/collabora/libsurvive/tree/master/tools/ros_publisher
Then, I'm not sure if that's actually your problem, but the global scene online calibration never worked well for me and was more likely to mess up a good calibration, so usually I'd:
- Delete the old calibration in ~/.config/libsurvive/config.json
- Run survive_cli and calibrate by putting the device(s) in a few different places
- Run the actual app I want to run with the environment variable SURVIVE_GLOBALSCENESOLVER=0 (should also be possible to set in config.json somehow)
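If it helps, here is a sketch of doing that last step from inside a wrapper instead of the shell, assuming the SURVIVE_GLOBALSCENESOLVER environment variable is read when the context is initialized (I haven't verified the config.json route):

```cpp
// Sketch: disable the global scene solver before libsurvive reads its
// configuration. Assumes the SURVIVE_GLOBALSCENESOLVER environment
// variable mentioned above is honored at init time.
#include <cstdlib>

#include <survive_api.h>

int main(int argc, char **argv) {
  // Must be set before survive_simple_init() parses the config.
  setenv("SURVIVE_GLOBALSCENESOLVER", "0", 1);

  SurviveSimpleContext *actx = survive_simple_init(argc, argv);
  if (!actx)
    return 1;
  survive_simple_start_thread(actx);
  // ... run the actual app here ...
  survive_simple_close(actx);
  return 0;
}
```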
I feel like tracking has been better when I have two base stations in front of me, one to the left and one to the right, rather than on opposite sides of the room, but it could also be placebo. Also, tracking generally doesn't work great if the device is very close to a base station (<20 cm).
Generally the tracking quality isn't great but should be "ok". Improvements always welcome.
Thanks Christoph, I appreciate your response. Unfortunately the code you linked to is written for ROS 1 and I'm using ROS 2, but thanks for the pointer anyway.
Have you found more consistent behavior with any of the other posers besides MPFIT? I can't test this myself yet because I can't specify launch arguments inside the wrapper I'm writing, but have you noticed any differences between the solvers?
Wow, moving both of the base stations to the same side of the room has made a HUGE difference! Thanks for the advice!!
In my experience it is crucial to avoid hard reflective surfaces in the line of sight of the lighthouses. Whenever I am in front of a window, the butterfly-like movement behaviour in your video shows up. I also banned chromed furniture elements from my room; these really mess with the tracking.
Do you have reflective objects in your room that are now out of sight of the lighthouses?
I experimented with tracking in a setup without any reflective surfaces. SteamVR works fine in my setup, while libsurvive has tracking issues even with the SteamVR calibration. My observations suggest that the Kalman filter is parametrized in a way that amplifies noise instead of reducing it. For example, ignoring IMU data seems to improve tracking and reduce sudden jerks. What comes from the optical sensors looks perfect to me, but it gets worse after filtering.
My current observation is that setting the IMU movement thresholds to zero improves things. I believe using thresholds is a bad idea anyway, since the filter then neglects significant data just below the threshold. But the tracking is still not good enough for battles in HL: Alyx, although the game as such is mostly playable.
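To illustrate the threshold concern with a toy model (not libsurvive's actual filter code; the threshold and rate values are made up):

```cpp
// Toy model of an IMU movement gate: readings below the threshold are
// treated as zero, so a sustained slow rotation never reaches the filter.
#include <cstdio>

int main() {
  const double threshold = 0.05; // hypothetical gyro gate in rad/s
  const double true_rate = 0.04; // real motion just below the gate
  const double dt = 0.01;        // 100 Hz IMU samples
  double true_angle = 0.0, gated_angle = 0.0;
  for (int i = 0; i < 1000; ++i) { // 10 s of slow rotation
    true_angle += true_rate * dt;
    gated_angle += (true_rate >= threshold ? true_rate : 0.0) * dt;
  }
  // Prints "true: 0.40 rad, gated: 0.00 rad" -- the gated filter sees
  // none of the motion, even though 0.4 rad (~23 degrees) is significant.
  std::printf("true: %.2f rad, gated: %.2f rad\n", true_angle, gated_angle);
  return 0;
}
```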
Thanks for the advice. I guess I should clarify what we mean by "reflective". I have a Doosan robot arm made of steel with an unpolished finish and a few glossy plastic surfaces. Would these things create reflections that cause tracking error?
You may try to cover them, or look at them with any low-end camera. Low-end cameras usually see IR as blue, so you can check whether you can see a reflection of the lighthouse. In my experience the reflective surface has to be big, like a wall mirror or a window, to produce significant tracking issues. A picture on the wall, mounted high enough, may cause problems too. Things on the desk are usually too low to cause issues at normal tracking height, but a polished desk itself could be a problem. The lighthouse beam has to be systematically reflected in the direction of the headset, so anything that reflects in other directions is of no concern.