PX4-Autopilot
Draft: Vision-based target estimator (Kalman Filter)
Objectives
- Extend the current landing target estimator to:
  - Have separate dynamics for moving targets.
  - Fuse GNSS (velocity / position) and vision.
  - Estimate the full pose of the target (position + orientation).
- Create a general Kalman Filter framework in which it is simple to include other observations, e.g. uwb, irlock.
- Make it simple to use the Kalman Filter for purposes other than precision landing, e.g. precision delivery.
- Be robust to temporary vision losses: use the GNSS position of the target (or the mission's landing position) and estimate the GNSS bias using vision.
- Update the precision landing algorithm to also control the orientation if enabled.
TODO
- [x] If target is static, use info in EKF
- [x] Log relevant topics
- [x] Simplify logic to start estimators
- [x] Handle z direction in `landing_target_pose` msg (e.g. `.vz_rel`)
- [x] Log `test_ratio` in `estimator_aid_source3d_s` (from the Kalman filter's `beta = _innov / _innov_cov * _innov`)
- [x] Reduce flash size, overflow for fmu_v5x and fmu_v6*
- [x] Include orientation in precision landing. Linked to https://github.com/PX4/PX4-Autopilot/pull/20487
- [x] When a GNSS observation is available and we get a secondary source of position measurement, only reset the position and bias part of the filter. Keep the same velocity estimation
- [x] Handle moving targets
- [x] Simplify estimator initialization
- [x] Require a first velocity estimate to start the filter (can come from local_pose)
- [x] Refactor start/stop/set estimator to simplify the use of the target estimator for different tasks.
- [x] Test in simulation
- [x] Flight test
- [x] Refactor orientation filter to follow position filter (with augmented state)
- [ ] Change logic for fusing GPS relative velocity
- [x] Rename files, no need to specify `decoupled` anymore
- [ ] Review code
- [ ] Solve stack issues (remove modules in custom builds if necessary)
- [ ] Update PX4 documentation
- [ ] Once merged, the TARGET_RELATIVE and TARGET_ABSOLUTE MAVLINK messages must be moved from `development.xml` to `common.xml`. Then update the custom build images and remove `CONFIG_MAVLINK_DIALECT="development"`
- [ ] Further flight test on moving targets
- [ ] Place a GNSS on the target in real life. (Could be done by someone who uses the follow-me module)
Changelog Entry
- TO BE FILLED
Context
Linked to PR https://github.com/PX4/PX4-Autopilot/pull/20653
The PR has been flight tested and is ready to be reviewed @dagar @bresch @potaito. Note that due to stack size limitations, it was necessary to create custom build images (`px4_fmu-v5_visionTargetEst`) which do not contain every module (e.g. the previous landing target estimator is not included in the build). To reduce the stack size, the vision target estimator is only built `#if !defined(CONSTRAINED_FLASH)`. Here are more details:
- How to use?
- Structure overview
- Relevant topics for log analysis
- Simulation test results
- Real flight test results
- Kalman Filter Equations
- Additional details
- TODOs
How to use?
The vision target estimator (VTE) fuses the available information in a Kalman Filter to land precisely on static and moving targets while also controlling the yaw. The VTE can be enabled by setting `VTE_EN = 1`. The VTE position and yaw control are enabled with `VTE_POS_EN = 1` and `VTE_YAW_EN = 1` respectively. To control the yaw, `PLD_YAW_EN` must be set to 1 in the precision landing module. The sensor fusion is performed in the Vehicle-carried North-East-Down frame (Vc-NED), a NED frame that moves with the UAV.
Sensors fused (controlled with the bitmask `VTE_AID_MASK`; a hedged example of checking such a mask follows this list):
- GNSS on target
  - GNSS observations are sent via the MAVLINK message TARGET_ABSOLUTE.
  - Only tested in Gazebo sim due to lack of hardware (requires a GNSS unit on the target that communicates with the drone).
- GNSS velocity
  - Uses the drone's main GNSS unit via the topic `vehicle_gps_position` (and potentially the MAVLINK message TARGET_ABSOLUTE for moving targets).
  - Tested in sim and real flights.
- Mission GNSS position
  - Only for static targets. Uses the landing position of the mission as an observation for the target. Note that the bias between the GNSS and the vision measurement is estimated in the KF. Therefore, as long as the vision detects a target, the landing position does not have to be precise. The goal of the GNSS bias is to be able to land precisely if the vision target is lost during the descent.
  - Tested in sim and real flights.
- Onboard vision
  - Observations are sent via the MAVLINK message TARGET_RELATIVE.
  - Tested in sim and real flights.
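For readers unfamiliar with PX4-style aid masks, here is a minimal, hedged C++ sketch of how a bitmask like `VTE_AID_MASK` is typically checked. The bit assignments and names below are placeholders for illustration, not the actual parameter definition:

```cpp
#include <cstdint>

// Placeholder bit assignments for illustration only; the real VTE_AID_MASK
// bits are defined in the module's parameter metadata.
enum SensorFusionMask : int32_t {
	USE_TARGET_GNSS_POS  = (1 << 0),	// GNSS on target
	USE_UAV_GNSS_VEL     = (1 << 1),	// GNSS velocity
	USE_MISSION_GNSS_POS = (1 << 2),	// Mission GNSS position
	USE_EXT_VIS_POS      = (1 << 3),	// Onboard vision
};

bool shouldFuseVision(int32_t vte_aid_mask)
{
	// Fuse vision observations only if the corresponding bit is set.
	return (vte_aid_mask & SensorFusionMask::USE_EXT_VIS_POS) != 0;
}
```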
Static and moving targets (controlled with `VTE_MODE`):
- Static: constant-position Kalman Filter model.
  - Extensive (200+) real flight tests fusing Vision + GNSS mission position + GNSS velocity.
- Moving: constant-acceleration Kalman Filter model.
  - Fewer than 50 real flights, only fusing Vision + GNSS velocity (lack of hardware).

Set which task(s) the VTE should handle (controlled with `VTE_TASK_MASK`):
- For now only precision landing is supported, but the structure makes it easy to start/stop the estimators for other tasks, e.g. FollowMe, precision takeoff, precision delivery, ...
Secondary params that can have a big influence on performance:
- `VTE_POS_NIS_THRE`, `VTE_YAW_NIS_THRE`: Normalized Innovation Squared (NIS) thresholds for the position and orientation estimators respectively. The lower the thresholds, the more observations will be flagged as outliers (see the sketch below this list).
- `VTE_BTOUT`: Time after which the target is considered lost without any new position measurements.
- `VTE_GPS_V_NOISE`, `VTE_GPS_P_NOISE`, `VTE_EVP_NOISE`: Minimum allowed observation noise. If these values are too low and the observation's variance is under-estimated, the system can oscillate.
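To make the gating concrete, the sketch below shows how a scalar Normalized Innovation Squared check of the form `beta = _innov / _innov_cov * _innov` can be used to accept or reject an observation. It is a minimal illustration assuming a scalar innovation; the function name is hypothetical, not the exact PX4 implementation:

```cpp
// Hypothetical NIS gate for a scalar innovation (illustration only).
// beta = innov^2 / innov_var is the Normalized Innovation Squared.
bool isObsInlier(float innov, float innov_var, float nis_threshold)
{
	const float beta = (innov * innov) / innov_var;

	// Logged as test_ratio: values above 1 mean the observation exceeds
	// the VTE_POS_NIS_THRE / VTE_YAW_NIS_THRE gate and is not fused.
	const float test_ratio = beta / nis_threshold;

	return test_ratio <= 1.f;
}
```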
Structure overview of `vision_target_estimator`:
- `vision_target_estimator_params`
  - Contains all the params for the VTE.
- `VisionTargetEst`
  - Spawns the VTE task. @dagar @bresch note that it was set to a priority of -14 in `WorkQueueManager.hpp`: `static constexpr wq_config_t vte{"wq:vte", 6000, -14};`
  - Runs both the position and orientation filters at 50 Hz (hard coded) depending on the task at hand (`VTE_TASK_MASK`).
  - Down-samples the UAV's acceleration (input of the estimators).
  - Subscribes to the relevant topics required for the estimators, e.g. `vehicle_angular_velocity`, `vehicle_local_position`, `prec_land_status`, ...
- Position
  - `VTEPosition`
    - Processes sensor observations (`bool VTEPosition::processObs*`):
      - Publishes a status for each observation using the `estimator_aid_source3d_s` message field.
      - The Normalized Innovation Squared (NIS) test is used to determine whether an observation should be fused or not. It is logged in the `test_ratio` field for each observation (e.g. `vte_aid_gps_pos_mission.test_ratio`). The threshold can be defined with the param `VTE_POS_NIS_THRE`.
    - Handles the sensor fusion of all `_target_estimator` (`Base_KF_decoupled *_target_estimator[nb_directions]`), which are defined in the `KF_xyzb_*` files (see the sketch after this overview).
    - Publishes `landing_target_pose` (for the prec-land module) and `vision_target_est_position`, which contains all the states of the position KF.
- Orientation
  - Same structure as Position but for the yaw sensor fusion.
- `python_derivation`
  - To speed up the KF matrix computations, SymForce is used. This folder contains the Python files for generating the code (e.g. `KF_xyzb_decoupled_static.py`) as well as the generated .h files in the `generated` folder (e.g. `predictCov.h`).
  - `base_KF_decoupled.h`: contains the virtual functions used for both the static and moving KFs.
  - `KF_xyzb*`: contain the Kalman Filter equations.
- `documentation`
  - Contains a PDF explaining the derivation of the Kalman Filter models and the design choices.
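As a rough illustration of the decoupled structure above, the position filter holds one independent estimator per axis and feeds each one the corresponding component of the UAV acceleration. The interface below is an assumption based on the names in this PR (`Base_KF_decoupled`, `nb_directions`, `predictCov`), not the exact PX4 code:

```cpp
// Hypothetical sketch of the decoupled per-axis prediction step.
static constexpr int nb_directions = 3;	// x, y, z are fully decoupled

class Base_KF_decoupled {
public:
	virtual ~Base_KF_decoupled() = default;
	virtual void predictState(float dt, float uav_acc) = 0;	// input: UAV acceleration
	virtual void predictCov(float dt) = 0;	// SymForce-generated covariance step
};

void predictFilters(Base_KF_decoupled *estimators[nb_directions],
		    const float uav_acc_ned[nb_directions], float dt)
{
	// One small filter per NED axis instead of one large coupled filter.
	for (int i = 0; i < nb_directions; i++) {
		estimators[i]->predictState(dt, uav_acc_ned[i]);
		estimators[i]->predictCov(dt);
	}
}
```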
Relevant topics for log analysis:
- `vision_target_est_position`, `vision_target_est_orientation`: Make sure that the states of the KF are coherent, e.g. the variance must increase when an observation is lost.
- `vte_acc_input`: Make sure that the acceleration is properly down-sampled.
- `vte_aid_*`: Check `innovation` (should resemble white noise), `test_ratio` (should be large in case of an outlier), `observation_variance`, ...
Simulation results
Performed in Gazebo simulation using https://github.com/PX4/PX4-SITL_gazebo-classic/pull/950. A test is valid if the logs are coherent (checks in the section above) and the drone lands on the target. The tests mainly covered different combinations of sensors, tuning params, and the behavior of the KF under high sensor noise and outlier observations.
- Static target Position + Yaw:
  - [X] Vision
  - [X] Mission GNSS position
  - [X] Target GNSS
  - [X] Vision + Target GNSS
  - [X] Vision + Mission GNSS
  - [X] All the above + GNSS velocity
- Moving target Position:
  - [X] Vision
  - [X] Mission GNSS position
  - [X] Target GNSS
  - [X] Vision + Target GNSS
  - [X] Vision + Mission GNSS
  - [X] All the above + GNSS velocity
Flight Tests
Static target Position + Yaw control
- Representative logs:
  - https://logs.px4.io/plot_app?log=5fc565d6-d3ec-402d-9bc5-fe4a42ddc6b5
  - https://logs.px4.io/plot_app?log=cfffc174-4260-4bae-9212-b72487a1520f
- Results: the drone consistently landed on a small table without falling (i.e. below a 20 cm error).
  - [X] Vision
  - [X] Vision + Mission GNSS
  - [X] Vision + Mission GNSS + GNSS velocity

https://github.com/PX4/PX4-Autopilot/assets/74473718/4b2f6661-b7f4-486b-8b8a-a2c2ba995ac1

- Moving target Position: Less tested!
  - [X] Vision + GNSS velocity only
The figure below shows the vision observations (green), the GNSS observations (red), the KF output (blue), and the GNSS bias in the bottom plot. The output of the estimator is smoother than the vision observations. The GNSS bias makes it possible to fuse both vision and GNSS despite the GNSS mission position being offset w.r.t. the target.
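As a short, hedged sketch of why the bias state lets both sources be fused (an illustrative per-axis observation model consistent with the description above, not the exact equations from the documentation PDF):

$$
z_{vision} = p_{rel} + n_v, \qquad z_{gnss} = p_{rel} + b + n_g
$$

A constant offset between the mission position and the true target is then absorbed by the bias state $b$ instead of corrupting the relative position estimate $p_{rel}$.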
The figure below shows the innovations of the position target estimator for vision and GNSS observations. They resemble white noise, indicating good filter performance.
The figure below shows an example of outlier detection using the NIS threshold during a real flight. 71 seconds into the flight, the vision observation is a clear outlier (blue curve in top subplot). As expected, this is caught in the NIS check and the measurement is not fused (see two bottom plots). As a consequence, the KF's yaw estimation (green curve in top subplot) remains consistent despite the outlier measurements.
Kalman Filter Equations:
Static target
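The original equation images are not reproduced here. As a hedged reconstruction from the states described in this PR (relative position, relative velocity, GNSS bias, with the UAV acceleration as input), the per-axis constant-position model is approximately:

$$
x_k = \begin{bmatrix} p_{rel} \\ v_{rel} \\ b \end{bmatrix}, \qquad
x_{k+1} = \begin{bmatrix} 1 & \Delta t & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} x_k
+ \begin{bmatrix} -\tfrac{1}{2}\Delta t^2 \\ -\Delta t \\ 0 \end{bmatrix} a_{uav} + w_k
$$

Since the target is static, $v_{rel} = -v_{uav}$, which is why the UAV acceleration enters with a negative sign.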
Moving target
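Likewise, a hedged per-axis reconstruction of the constant-acceleration model, using the augmented state described later in this thread (`pos_rel`, `vel_uav`, `bias`, `acc_target`, `vel_target`); this is an assumption, not the authoritative derivation from the documentation PDF:

$$
x_k = \begin{bmatrix} p_{rel} \\ v_{uav} \\ b \\ a_t \\ v_t \end{bmatrix}, \qquad
x_{k+1} = \begin{bmatrix}
1 & -\Delta t & 0 & \tfrac{1}{2}\Delta t^2 & \Delta t \\
0 & 1 & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & \Delta t & 1
\end{bmatrix} x_k
+ \begin{bmatrix} -\tfrac{1}{2}\Delta t^2 \\ \Delta t \\ 0 \\ 0 \\ 0 \end{bmatrix} a_{uav} + w_k
$$

The five states per axis are consistent with the 3x5 measurement matrix mentioned in the refactor notes below.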
Implementation details:
- Coupled vs Decoupled
- Process GNSS obs
- Time delays
Moving targets param tuning
- `VTE_MOVING_T_MAX` corresponds to K_MAX
- `VTE_MOVING_T_MIN` corresponds to K_MIN
- `MPC_Z_V_AUTO_DN` is used for the expected downwards velocity.
Thanks for splitting this out; it should make it much easier and safer to bring it in. I'll review the overall structure.
First, for a bit of bikeshedding: instead of `VTEST`, can we try to find something more self-descriptive that doesn't sound like it's for testing?
- vision target estimator => VTARG_EST, VTARG, VIS_TARG?
Now that we don't have coupled filters, we can refactor the Base KF class to simplify the code and improve the virtual class inheritance (commit https://github.com/PX4/PX4-Autopilot/pull/22008/commits/53b45be8e28c1f35870dbc7e2a0f8ab3f509d0ac).
Main changes:
- In `VTEPosition.cpp`, use an `AugmentedState` defined as `pos_rel`, `vel_uav`, `bias`, `acc_target`, `vel_target` (which corresponds to the state of the moving target filter). A rough sketch follows this list.
  - Each filter has its own state and handles the received `AugmentedState` accordingly:
    - Static filter: `pos_rel`, `vel_rel = -vel_uav`, `bias`
    - Moving filter: `State = AugmentedState`
  - This drastically reduces the need to discriminate between filters (`if (_target_mode == TargetMode::Stationary) {...}`) in `VTEPosition.cpp` and thus simplifies the code.
  - It also clarifies the initialization of the filters and the publication of the targets. Previously, the initial velocity in `VTEPosition.cpp` depended on the target mode: if the target was static, `vel_init = vel_rel`, and if the target was moving, `vel_init = uav_vel`. Now we follow the augmented state in `VTEPosition.cpp`; therefore `vel_init` is always equal to `vel_uav`. The static filter transforms the augmented state into its own state, `vel_rel = -vel_uav`. Similarly, the filters' getters return the augmented state, e.g. the static filter transforms its state back into the augmented state.
  - No more unused functions in `KF_decoupled_static` inherited from `base_KF_decoupled.h`. Instead of having `setPosition(pos_init), setVel(vel_init), ...` we now have a single function `setState(augmented_state)`. The static filter simply ignores the fields it does not need.
- No need to do Cholesky decompositions anymore; therefore the measurement matrix can be 3x5 instead of 3x15 because the x, y, z directions are fully decoupled.
- Use of enums when accessing vectors to clarify the code.
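To illustrate the refactor, here is a minimal C++ sketch of the augmented-state idea under the assumptions above; the struct layout, enum, and class names are illustrative, not the exact code from the linked commit:

```cpp
#include <cstddef>

// Hypothetical per-axis augmented state (illustration only).
struct AugmentedState {
	float pos_rel;		// relative position (target - UAV)
	float vel_uav;		// UAV velocity
	float bias;		// GNSS - vision bias
	float acc_target;	// target acceleration
	float vel_target;	// target velocity
};

// Enum-based indexing, as mentioned above, keeps vector accesses readable.
enum StaticState : size_t { pos_rel = 0, vel_rel = 1, bias = 2, size = 3 };

class KFStaticSketch {
public:
	// Single entry point replacing setPosition(), setVel(), ...
	void setState(const AugmentedState &s)
	{
		_state[StaticState::pos_rel] = s.pos_rel;
		_state[StaticState::vel_rel] = -s.vel_uav;	// static target: vel_rel = -vel_uav
		_state[StaticState::bias] = s.bias;
		// acc_target / vel_target are simply ignored by the static filter
	}

	// Getter maps the internal state back to the augmented state.
	AugmentedState getAugmentedState() const
	{
		return {_state[StaticState::pos_rel], -_state[StaticState::vel_rel],
			_state[StaticState::bias], 0.f, 0.f};
	}

private:
	float _state[StaticState::size] {};
};
```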
> Create a general Kalman Filter framework in which it is simple to include other observations e.g. uwb, irlock

If this PR is merged, does it mean I can fly indoors (using UWB)? Use UWB data to obtain local position estimation.
Hi @xdwgood, this PR is for target estimation. It does not change the Extended Kalman Filter (EKF), which estimates the local position. However, we do use the target estimation to provide an additional velocity estimate to the EKF, which can help maintain a local position estimate.
This would be very useful! Any idea on timeline for merge?
> This would be very useful! Any idea on timeline for merge?
Hopefully in the next couple of months @AMoldskred. I will have time to continue working on the final steps starting from August.