Check if nig is working well
FYI @brunofavs
Related to #928
Make some sanity tests to make sure the noisy initial guess (nig) flag is working.
Using nig 0 0 we should have translation and rotation errors = 0.
Using nig 0.1 0 we should have translation errors = 0.1 (Does not happen)
Using nig 0 0.2 we should have rotation errors = 0.2 (Does not happen)
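A minimal sketch of such a sanity check, written directly with tf.transformations; how the noise is applied here (a translation offset of fixed norm plus a single-axis rotation offset) is my assumption, not necessarily how ATOM's nig flag does it:

import numpy as np
import tf

t_noise, r_noise = 0.1, 0.2  # i.e. nig 0.1 0.2

# Ground-truth transform: identity.
t_init = np.zeros(3)
q_init = tf.transformations.quaternion_from_euler(0, 0, 0)

# Translation noise of norm t_noise in a random direction.
direction = np.random.randn(3)
t_final = t_init + t_noise * direction / np.linalg.norm(direction)

# Rotation noise of r_noise radians about the x axis.
q_noise = tf.transformations.quaternion_from_euler(r_noise, 0, 0)
q_final = tf.transformations.quaternion_multiply(q_noise, q_init)

translation_error = np.linalg.norm(t_final - t_init)  # expect 0.1
# Geodesic angle between the two rotations, always in [0, pi].
rotation_error = 2 * np.arccos(min(abs(np.dot(q_init, q_final)), 1.0))  # expect 0.2
print(translation_error, rotation_error)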
Did some more work here.
a57f1652e2d46ae98ab6fa66850a0e45182f3b73
There is still a bug somewhere.
I reckon it's inside compareAtomTransforms().
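For reference, a matrix-based comparison along those lines (the matrix-multiplication approach mentioned below) could look like this sketch; compareAtomTransforms() is the real function name, but this body is only my guess at the technique, not ATOM's actual code:

import numpy as np

def compare_transforms_sketch(T1, T2):
    # T1, T2: 4x4 homogeneous transform matrices.
    translation_error = np.linalg.norm(T2[:3, 3] - T1[:3, 3])
    # Relative rotation between the two poses.
    R_rel = np.dot(T1[:3, :3].T, T2[:3, :3])
    # Rotation angle from the trace, clipped for numerical safety;
    # note the result is always in [0, pi].
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    rotation_error = np.arccos(cos_angle)
    return translation_error, rotation_error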
With nig 0.1 0.1
I couldn't get it to fail; it always seems correct, which could easily mislead us into thinking the implementation is fine.
However, for bigger noises the rotation fails:
nig 10 10
I made some changes to make sure the noise was being computed, and it still fails with a simpler method (rather than the matrix-multiplication approach sketched above):
import numpy as np
import tf

# Rotation error as the norm of the difference between the two
# sets of Euler angles recovered from each quaternion.
euler_angles_init = tf.transformations.euler_from_quaternion(transform_1['quat'])
euler_angles_final = tf.transformations.euler_from_quaternion(transform_2['quat'])
deuler = np.subtract(euler_angles_final, euler_angles_init)
rotation_error = np.linalg.norm(deuler)
EDIT: With the earlier matrix method for computing the rotation error, the behavior was similarly wrong.
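A possible contributing factor, not confirmed in this thread: rotation angles wrap around. Assuming the noise is applied as a single-axis rotation, a 10 rad rotation is the same element of SO(3) as a 10 - 2*pi ≈ 3.72 rad one, and the geodesic distance between two rotations is at most pi ≈ 3.14 rad, so no correct rotation-error metric can ever report 10; Euler-angle differences additionally wrap at ±pi. A quick demo:

import numpy as np
import tf

q1 = tf.transformations.quaternion_from_euler(0, 0, 0)
q2 = tf.transformations.quaternion_from_euler(10.0, 0, 0)  # "10 rad" of noise
# Geodesic angle between the two rotations, always in [0, pi].
angle = 2 * np.arccos(min(abs(np.dot(q1, q2)), 1.0))
print(angle)  # ~2.57 rad, not 10

This would also match the observation below that small noises look accurate while values beyond roughly 2 rad become inconsistent.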
However, for bigger noises the rotation fails
Bigger in rotation, translation, or both?
Is it that you are setting 10 radians for the angles, which is larger than 360 degrees?
nig 10 10 is unrealistic. Does it work with nig 1 3.14?
I should've been more clear.
The bigger the noise, the more often the results are wrong. At 10 rad it's always wrong, from what I tested. What I truly cannot understand is that for very low values it's accurate.
For values bigger than roughly 2 rad it's inconsistent.
Runs with nig 1 3.14:

[four screenshots] In the first image both are wrong, in the second only one is correct, in the third only the other one is correct, and in the fourth both are correct.
I should add that I already solved the problem we were having that led to only the multiple transformations being displayed correctly (in the commit referenced in my previous comment).
It is true that 10 rad is outrageous and unrealistic, but it still makes me wonder why it is happening.
While it does not seem to happen for more realistic, lower values, I cannot guarantee that it won't happen given a big enough batch experiment. I might simply not have run enough runs to experience it yet.
It is true that 10 rad is outrageous and unrealistic, but it still makes me wonder why it is happening.
I understand your discomfort, but perhaps we can take a more practical approach. Test it and see if it works fine up to 0.5 3.14/2. If it does, let's assume the noise cannot be larger than this and set a validation on the add-noise step that throws an ATOM error saying the noise cannot be larger than X.
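A minimal sketch of such a validation, with hypothetical limits and a plain exception standing in for ATOM's own error helper (the fix below ends up using an atomWarn() instead):

import numpy as np

def validate_nig(translation_noise, rotation_noise,
                 max_translation=0.5, max_rotation=np.pi / 2):
    # Hypothetical guard: refuse noise magnitudes beyond the tested range.
    if translation_noise > max_translation:
        raise ValueError('nig translation noise cannot be larger than %.2f m'
                         % max_translation)
    if rotation_noise > max_rotation:
        raise ValueError('nig rotation noise cannot be larger than %.2f rad'
                         % max_rotation)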
From 576b626:
- Bigger noises are still bugged, but now give an atomWarn().
It's not solved but it will be a lower priority.
The issue will remain open.