calibration_reader.py fails on OAK-D (likely an early model with no RGB camera calibration in EEPROM)
Running calibration_reader.py on my OAK-D in Windows fails immediately with
```
Traceback (most recent call last):
  File "calibration_reader.py", line 17, in <module>
    M_rgb, width, height = calibData.getDefaultIntrinsics(dai.CameraBoardSocket.RGB)
RuntimeError: There is no Intrinsic matrix available for the the requested cameraID
```
Setup
- Microsoft Windows [Version 10.0.19044.1348]
- Python 3.8.10
- depthai-python at 595c57aaf812e22a0f3ebbe63669e81683585b17
- Setup steps from https://docs.luxonis.com/projects/api/en/latest/samples/calibration/calibration_reader/#setup
Repro
- Run `python3 calibration_reader.py`
Result
```
Traceback (most recent call last):
  File "calibration_reader.py", line 17, in <module>
    M_rgb, width, height = calibData.getDefaultIntrinsics(dai.CameraBoardSocket.RGB)
RuntimeError: There is no Intrinsic matrix available for the the requested cameraID
```
Expected
No errors and output of calibration data. If there is no RGB calibration, then report such and continue to output calibration data for the other cameras.
Notes
I think my OAK-D is an earlier model, and I heard a rumor that some of them did not get RGB camera calibration saved to EEPROM at the factory. Its device name is 14442C10C1A1C6D200-ma2480.
Noticed that lines 17 and 23 of the script are duplicated. If I remove one duplicate and wrap the call in a try/except, I get more output, but there is a data error.
```
error getting RGB camera intrinsics
LEFT Camera resized intrinsics...
[[855.60162354 0. 640.66516113]
[ 0. 856.22357178 365.36651611]
[ 0. 0. 1. ]]
LEFT Distortion Coefficients...
k1: -4.738511085510254
k2: 14.584341049194336
p1: 0.0013627043226733804
p2: 0.00028065513470210135
k3: -13.716522216796875
k4: -4.795283317565918
k5: 14.808625221252441
k6: -13.929564476013184
s1: 0.0
s2: 0.0
s3: 0.0
s4: 0.0
τx: 0.0
τy: 0.0
RIGHT Camera resized intrinsics...
[[855.84124756 0. 639.67047119]
[ 0. 856.5279541 368.2331543 ]
[ 0. 0. 1. ]]
RIGHT Distortion Coefficients...
k1: -5.895343780517578
k2: 20.52387809753418
p1: 0.0006348574534058571
p2: -0.0006049301591701806
k3: -23.919910430908203
k4: -5.9453325271606445
k5: 20.72853660583496
k6: -24.12788200378418
s1: 0.0
s2: 0.0
s3: 0.0
s4: 0.0
τx: 0.0
τy: 0.0
RGB FOV 68.7938003540039, Mono FOV 71.86000061035156
LEFT Camera stereo rectification matrix...
[[ 9.95891940e-01 -7.44438190e-03 9.31839077e+00]
[ 4.79700486e-03 1.00023902e+00 -8.49013926e-02]
[-6.79202980e-06 -2.44090900e-07 1.00442367e+00]]
RIGHT Camera stereo rectification matrix...
[[ 9.95613104e-01 -7.44173641e-03 1.05077240e+01]
[ 4.79566177e-03 9.99883566e-01 -2.81570466e+00]
[-6.79012812e-06 -2.44004158e-07 1.00441637e+00]]
Transformation matrix of where left Camera is W.R.T right Camera's optical center
[[ 9.99635756e-01 -2.55835019e-02 8.59151501e-03 -7.46199036e+00]
[ 2.55791750e-02 9.99672592e-01 6.13274460e-04 -1.36521935e-01]
[-8.60439241e-03 -3.93287191e-04 9.99962926e-01 2.08357461e-02]
[ 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.00000000e+00]]
Transformation matrix of where left Camera is W.R.T RGB Camera's optical center
[[ 0. 0. 0. -7.46199036]
[ 0. 0. 0. -0.13652194]
[ 0. 0. 0. 0.02083575]
[ 0. 0. 0. 1. ]]
```
The first line is the debug print I put in the try/except wrap.
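My wrap was roughly the following (a sketch; the helper name is mine, and it assumes only what the traceback above shows: depthai's `CalibrationHandler.getDefaultIntrinsics()` raises `RuntimeError` when the EEPROM holds no intrinsics for the requested socket):

```python
def read_intrinsics_safely(calib, socket, label):
    """Fetch intrinsics for one camera socket; report and continue if absent.

    Returns the (matrix, width, height) tuple from getDefaultIntrinsics(),
    or None when no calibration is stored for that socket.
    """
    try:
        return calib.getDefaultIntrinsics(socket)
    except RuntimeError:
        # e.g. early OAK-D units with no RGB calibration in EEPROM
        print(f"error getting {label} camera intrinsics")
        return None
```

With this, the script could report a missing RGB calibration and still print the remaining cameras' data, as requested under "Expected".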
Look at the last dataset, "Transformation matrix of where left Camera is W.R.T RGB Camera's optical center". The right-most column is a translation, and that translation is definitely wrong. Instead of failing, the API reports no problem and quietly reuses the translation from the dataset directly before it (easy to see, since 7.5 cm is approximately the left→right baseline). I suspect this is related to the absence of RGB calibration data not being handled correctly. The failing call is:

```
l_rgb_extrinsics = np.array(calibData.getCameraExtrinsics(dai.CameraBoardSocket.LEFT, dai.CameraBoardSocket.RGB))
```
I believe that API should throw a RuntimeError just like calibData.getDefaultIntrinsics(dai.CameraBoardSocket.RGB) already does.
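Until the API throws, a host-side sanity check could reject the bogus matrix. This is a sketch (hypothetical function name) based only on the observation above: the returned extrinsic has an all-zero 3x3 rotation block, which no valid rigid transform can have.

```python
import numpy as np

def extrinsics_look_valid(T):
    """Reject a 4x4 extrinsic whose 3x3 rotation block is all zeros --
    the degenerate value observed when no RGB calibration is stored."""
    T = np.asarray(T, dtype=float)
    return T.shape == (4, 4) and not np.allclose(T[:3, :3], 0.0)
```

An app could call this on the result of `getCameraExtrinsics()` and treat a failed check the same as a RuntimeError.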
Thanks and sorry for the trouble. Adding @saching13 to help.
@saching13, if you want me to validate any fixes, note that I cannot run calibration.py, because it will overwrite the EEPROM with a full set of calibration data. Please tell me if you want that.
Otherwise, I will likely update the calibration sometime in the next two weeks, and the repro scenario will be forever lost on my OAK-D.
Hello @diablodale, thanks a lot for the feedback. Regarding the RGB camera not being calibrated: can you share when the device was bought?
As for adding a try/except, the script needs more than that. For example, when we run the script with an OAK-1, it throws a runtime error when trying to fetch the LEFT/RIGHT camera intrinsics or any extrinsics. So I need to branch on which device is connected and fetch only the intrinsics that device has, along with the try/except. I had backlogged this task for now; will work on it.
> Look at the last dataset Transformation matrix of where left Camera is W.R.T RGB Camera's optical center. The right-most column is a translation and that translate is definitely wrong.

This is weird. It should throw an error. I will cross-check this and add a fix. Thanks for finding it.
I bought the "OAK-D Campaign Edition" and my pledge was collected 13 Aug 2020.
I am also thinking about how I will detect/manage different OAK device models with my C++ app. My first thought was to use the list of board models and have a mapping of model→feature(s). But now my thinking is to probe for features (RGB, mono(s), focus, PoE, etc.), so I don't have a fixed-in-time list that is outdated at the first new OAK model, or the extra work of managing/distributing a dynamic list.
Do you want me to wait on your fixes to test them? Or can I update my cali in EEPROM this month?
> I also am thinking how I will detect/manage different OAK device models with my C++ app

If you mean detecting multiple OAK devices, then you can use the `getAllAvailableDevices()` function here. If not, can you elaborate?
> Do you want me to wait on your fixes to test them? Or can I update my cali in EEPROM this month?

If you don't have any specific use for the RGB calibration information and don't need to align depth to the RGB camera, you can use the device as it is without any issues. If you do need RGB calibration and depth alignment, you can follow this and recalibrate the device.
> I bought the "OAK-D Campaign Edition" and my pledge was collected 13 Aug 2020.

Can I know which month it was delivered instead?
Hi. I'm not being clear enough. I'll try again...
Today this issue reproduces on my OAK-D hardware.
If I run calibration.py, that will FOREVER CHANGE my hardware device: the EEPROM on it will be reprogrammed.
This issue may (or may not) reproduce on my changed OAK-D and the new EEPROM data.
My question to you is: do you want my OAK-D for testing? If you say "no", then I will reprogram the EEPROM and can NOT help you test your fixes. If you say "yes", then I will delay reprogramming my EEPROM and can help you test.
> I also am thinking how I will detect/manage different OAK device models with my C++ app
My app needs to support backward and forward OAK devices as best as possible. For example, I don't want to code my app to support only models BW1098OAK and BW1093. That is too restrictive: when the OAK-D-Lite is released, my app could not support it because it is not in that list of two models, and I would have to recompile and redistribute. 👎
Instead, I want to probe for features. I don't care what model the device is:
- Probe for RGB camera
- Probe for IMU
- Probe for Mono left
- Probe for Mono right
- etc.
If I use a feature-probing approach, then my app works on OAK-1 and OAK-D, and it will work on OAK-D-Lite when that ships. The OAK-D-Lite doesn't have the same resolution and doesn't have an IMU, but my app will still work, because I can probe for those features, get an error, and then disable those features in my app. I have not tried this yet. 🤔
This probing approach is also relevant to this Python script. Do you want the burden of keeping it updated for new models? That seems like more work than needed. If instead it probed for features, it could display the calibration data for only the features the device has.
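The probing idea can be sketched as follows (hypothetical helper; it assumes some probe returns a list of camera sockets, e.g. the result of depthai's `Device.getConnectedCameras()`, and keys off their names rather than any board-model list):

```python
def detect_features(connected_sockets):
    """Turn a probed list of camera sockets into app feature flags,
    instead of hard-coding a model->feature table.

    `connected_sockets` may be enum values or strings; stringifying and
    taking the last dotted component keeps this sketch library-agnostic
    (e.g. "CameraBoardSocket.RGB" -> "RGB").
    """
    names = {str(s).rsplit(".", 1)[-1] for s in connected_sockets}
    return {
        "rgb": "RGB" in names,
        "stereo": {"LEFT", "RIGHT"} <= names,  # depth needs both monos
    }
```

So an OAK-1-style probe result (RGB only) would disable the stereo features, while an OAK-D-style result enables both, with no fixed model list to maintain.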
> Do you want my OAK-D for testing?

No, that is fine. I can recreate the issue.
As for the OAK-D-Lite, you can get which camera it is using `getConnectedCameraProperties()`, which will be added to the main branch later; another option is determining which device it is using the board name stored in the EEPROM.
In this script we are not probing any of the cameras. I am just using the camera socket as an ID to get the camera intrinsics and extrinsics from the EEPROM, which is central to the device.
@diablodale, regarding "probing": this is something we intend to add soon as part of getting a connected device (before booting the actual FW). Right now, one can probe only by connecting to a device (which in your case might be adequate) with the use of RPC functions to get connected cameras, their types, etc. (more functions to be exposed easily).
The change we intend to make is to be able to detect the actual board being used beforehand; the host will then recognize it and send additional details along (so the FW knows what the board is capable of, allowing a unified FW configured this way). The host will be able to know up front (when getting connected devices) what board it is.
I can write up a use case, and/or if you have a spec for that API(s) happy to review. Help me help you help me? ;-)
Sounds good - I'll create an issue to discuss the public part of this endeavor, and we can continue there :)
@diablodale Created the first draft - https://github.com/luxonis/depthai-core/issues/285 Just as a scope and general idea of why & how behind it