
Implement Ali's RWHE calibration method

Kazadhum opened this issue 9 months ago • 10 comments

The idea is to implement the RWHE-Calib method for Hand-Eye calibration in ATOM.

Similarly to OpenCV, they have two different methods:

  • Hand-Eye, which solves the $AX=XB$ equation;
  • Robot-World/Hand-Eye, which solves the $AX=ZB$ equation.
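For reference, the two formulations can be sanity-checked on synthetic data. Below is a minimal pure-NumPy sketch (frame names, values, and the `demo_residuals` helper are illustrative, not ATOM's code): with ground-truth hand-to-camera ($X$) and base-to-pattern ($Z$) transforms, the $AX=ZB$ relation holds per collection, while $AX=XB$ holds for relative motions between collections.

```python
import numpy as np

def make_transform(rotvec, t):
    """Homogeneous 4x4 transform from a rotation vector (Rodrigues formula)
    and a translation, using only NumPy."""
    rotvec = np.asarray(rotvec, dtype=float)
    theta = np.linalg.norm(rotvec)
    if theta < 1e-12:
        R = np.eye(3)
    else:
        kx, ky, kz = rotvec / theta
        K = np.array([[0.0, -kz, ky],
                      [kz, 0.0, -kx],
                      [-ky, kx, 0.0]])
        R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def demo_residuals(n_poses=5, seed=0):
    """Generate synthetic eye-in-hand data and check that both hand-eye
    formulations hold. All frames and numbers are made up for illustration."""
    rng = np.random.default_rng(seed)
    X = make_transform([0.1, -0.2, 0.3], [0.02, 0.0, 0.065])  # hand -> camera
    Z = make_transform([0.0, 0.1, -0.1], [0.5, 0.3, 0.2])     # base -> pattern

    base_T_hand = [make_transform(rng.normal(size=3), rng.normal(size=3))
                   for _ in range(n_poses)]
    # Simulated pattern detections: camera -> pattern, consistent with X and Z
    cam_T_pattern = [np.linalg.inv(bTh @ X) @ Z for bTh in base_T_hand]

    # Robot-World/Hand-Eye, AX = ZB: A = base->hand, B = pattern->camera
    rwhe = max(np.abs(A @ X - Z @ np.linalg.inv(B)).max()
               for A, B in zip(base_T_hand, cam_T_pattern))

    # Classic Hand-Eye, AX = XB: A and B are relative motions between poses
    A01 = np.linalg.inv(base_T_hand[0]) @ base_T_hand[1]
    B01 = cam_T_pattern[0] @ np.linalg.inv(cam_T_pattern[1])
    he = np.abs(A01 @ X - X @ B01).max()
    return rwhe, he
```

With noise-free synthetic data both residuals are at machine precision; real detections would of course leave a nonzero residual for the solver to minimize.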

Am I correct in assuming we only want the second, @miguelriemoliveira? We spoke about this on Monday because of the OpenCV calibration and, if I recall correctly, we landed on just using RWHE.

By the way, ATOM has a script called "convert_from_rwhe_dataset.py". Does this script refer to this method?

Kazadhum avatar May 03 '24 10:05 Kazadhum

> Am I correct in assuming we only want the second, @miguelriemoliveira? We spoke about this on Monday because of the OpenCV calibration and, if I recall correctly, we landed on just using RWHE.

I think the hand-eye should solve both variants, eye-in-hand and eye-to-world. In any case you can use Robot-World/Hand-Eye; I think it's better.

I have been fighting with OpenCV's method and still could not make it work.

> By the way, ATOM has a script called "convert_from_rwhe_dataset.py". Does this script refer to this method?

Yes, we used Ali's method, but in the MATLAB script.

miguelriemoliveira avatar May 03 '24 10:05 miguelriemoliveira

Hi @miguelriemoliveira!

> I think the hand-eye should solve both variants, eye-in-hand and eye-to-world. In any case you can use Robot-World/Hand-Eye; I think it's better.

Ok, that sounds good.

> I have been fighting with OpenCV's method and still could not make it work.

If you want, I could join you in Zoom after lunch to try to help. Have you made any progress or is it how we left it?

> Yes, we used Ali's method, but in the MATLAB script.

Got it! These scripts will help a lot, I think.

Kazadhum avatar May 03 '24 10:05 Kazadhum

> If you want, I could join you in Zoom after lunch to try to help. Have you made any progress or is it how we left it?

I changed a lot, trying to clean up the code. It did not help.

I will call if I have some time this afternoon.

miguelriemoliveira avatar May 03 '24 10:05 miguelriemoliveira

Ok, sounds good!

Kazadhum avatar May 03 '24 11:05 Kazadhum

Picking this back up...

Kazadhum avatar May 08 '24 14:05 Kazadhum

Possible progress! I translated the Li calibration method from that code base to Python (hopefully correctly!). Now I need to get the comparison working so I can test it properly. I can't say I fully understand the code, but I believe the translation is correct.

Kazadhum avatar May 09 '24 16:05 Kazadhum

That's good, but running is the real deal.

Looking forward to seeing if it runs correctly.

miguelriemoliveira avatar May 09 '24 20:05 miguelriemoliveira

Hello all! I think I closed this by accident this morning...

Some notes:

  • I just found out the method I was implementing from this repository was not Ali's, but Li's, so I'll work on Ali's next (I did some debugging by running the MATLAB code with the A and B matrices hardcoded);
  • I got Li's method working for the eye-in-hand case; I'll make another script for the eye-to-hand case.
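The difference between the two variants mostly shows up in how the A and B matrices are assembled per collection. A hedged sketch of the idea (the `build_AB` helper and its argument names are hypothetical, not ATOM's actual code): a common trick for the eye-to-hand case is to feed the solver the inverted robot poses, so the same $AX=ZB$ machinery can be reused.

```python
import numpy as np

def build_AB(base_T_gripper, cam_T_pattern, eye_in_hand=True):
    """Illustrative sketch: assemble per-collection A and B inputs for an
    AX = ZB solver in its two variants.

    eye-in-hand: camera on the gripper, pattern fixed -> A = base->gripper
    eye-to-hand: camera fixed, pattern on the gripper -> A = gripper->base
                 (the common inverted-poses trick; actual scripts may differ)
    """
    A, B = [], []
    for bTg, cTp in zip(base_T_gripper, cam_T_pattern):
        if eye_in_hand:
            A.append(bTg)                  # base -> gripper
        else:
            A.append(np.linalg.inv(bTg))   # gripper -> base
        B.append(np.linalg.inv(cTp))       # pattern -> camera
    return A, B
```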

For the eye-in-hand case, running:

rosrun atom_evaluation li_eye_in_hand.py -c rgb_hand -p pattern_1 -bl base_link -hl flange -json $ATOM_DATASETS/rihbot/train_test_opencv/dataset.json -ctgt

we get:

After filtering, will use 5 collections: ['000', '001', '002', '003', '004']
Selected collection key is 000
Calculating A and B matrices for collection 000...
Calculating A and B matrices for collection 001...
Calculating A and B matrices for collection 002...
Calculating A and B matrices for collection 003...
Calculating A and B matrices for collection 004...
Ground Truth h_T_c=
[[ 0.00000000e+00  1.11022302e-16  1.00000000e+00 -2.00000000e-02]
 [-1.00000000e+00 -2.22044605e-16  1.11022302e-16  0.00000000e+00]
 [ 0.00000000e+00 -1.00000000e+00  0.00000000e+00  6.50000000e-02]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  1.00000000e+00]]
estimated h_T_c=
[[ 1.49042521e-04 -3.88510070e-04  9.99999913e-01 -1.96416458e-02]
 [-9.99999898e-01 -4.25657727e-04  1.48877146e-04 -9.81041564e-04]
 [ 4.25599850e-04 -9.99999834e-01 -3.88573471e-04  6.44773915e-02]
 [ 0.00000000e+00  0.00000000e+00  0.00000000e+00  1.00000000e+00]]
Etrans = 0.621 (mm)
Erot = 0.018 (deg)
+----------------------+-------------+---------+----------+-------------+------------+
|      Transform       | Description | Et0 [m] |  Et [m]  | Rrot0 [rad] | Erot [rad] |
+----------------------+-------------+---------+----------+-------------+------------+
| flange-rgb_hand_link |   rgb_hand  |   0.0   | 0.000382 |     0.0     |  0.000321  |
+----------------------+-------------+---------+----------+-------------+------------+
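The Etrans/Erot figures above compare the estimated hand-to-camera transform against ground truth. A minimal sketch of how such metrics are typically computed, for illustration only (this is not ATOM's evaluation code): translation error is the Euclidean distance between the translation columns, and rotation error is the angle of the relative rotation.

```python
import numpy as np

def translation_rotation_error(T_gt, T_est):
    """Compare two 4x4 homogeneous transforms: translation error in
    millimetres and rotation error in degrees (angle of the relative
    rotation, recovered from its trace)."""
    etrans_mm = np.linalg.norm(T_gt[:3, 3] - T_est[:3, 3]) * 1000.0
    R_rel = T_gt[:3, :3].T @ T_est[:3, :3]
    # Clip to guard against round-off pushing the cosine outside [-1, 1]
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    erot_deg = np.degrees(np.arccos(cos_angle))
    return etrans_mm, erot_deg
```

Plugging in the ground-truth and estimated matrices printed above should reproduce errors of the same sub-millimetre, sub-degree magnitude.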

Now creating a copy of this script for the eye-to-hand case and adapting it accordingly...

Kazadhum avatar May 13 '24 14:05 Kazadhum

Working on implementing Shah's method, also from that repo. Since it is similar to Li's, the implementation should be relatively simple.

Kazadhum avatar May 15 '24 09:05 Kazadhum

Bug-fixing Li's method, after testing on the (simulated) riwmpbot revealed issues.

Kazadhum avatar Jun 12 '24 14:06 Kazadhum