pyboreas
Strange shaking phenomenon when applying polar-to-cartesian conversion to radar data
Hi, thanks for releasing this awesome Boreas Dataset.
I have recently attempted visualization using the code provided below. The attached video demonstrates the output result. I am puzzled by the persistent shaking of the radar image. It seems that I may have overlooked some crucial details, but I am uncertain about what they could be. I would greatly appreciate your assistance in addressing this issue. Thank you very much.
import matplotlib.pyplot as plt
import matplotlib.transforms as mtf
import numpy as np
import cv2
from pyboreas import BoreasDataset
from pyboreas.data.splits import obj_train, obj_test, obj_sample
from pyboreas.utils.utils import get_inverse_tf
bd = BoreasDataset('/data/Boreas/', split=obj_train)
seq = bd.sequences[10] ## ** Try different sequences!
seq.filter_frames_gt()
#seq.synchronize_frames('lidar')
#print(seq.radar_frames)
bounds = [-75, 75, -75, 75, -5, 10]
index = 4 ## ** Try different indices in the sequence!
for index in range(len(seq.radar_frames)):
    rad = seq.get_radar(index)
    radar_cart = rad.polar_to_cart(0.25, 1000)  # from polar to cartesian
    radar_cart = cv2.addWeighted(radar_cart, 1, radar_cart, 1, 0)  # used to enhance the brightness
    cv2.imshow("radar", radar_cart)
    cv2.waitKey(10)
https://github.com/utiasASRL/pyboreas/assets/88025855/241bf756-4f6e-4a98-a496-bb047a8fede2
Hmm, I think the issue is that the polar_to_cart function is trying to work with the azimuth values stored in the polar image files. However, I remember that these were slightly corrupted in the boreas-objects-v1 sequences because the images were compressed with a lossy format; I have since replaced the lossy compression with a lossless image format. For all the sequences other than boreas-objects-v1, this shouldn't be a problem.
Try calling this function directly and setting fix_wobble to False:
https://github.com/utiasASRL/pyboreas/blob/0a687fa66387e24aea6335fd04fb242f7aece774/pyboreas/utils/radar.py#L59C5-L59C29
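As a rough sketch of what I mean (I'm assuming the loaded radar frame exposes azimuths, polar, and resolution attributes and that radar_polar_to_cartesian accepts them in that order plus cart_resolution, cart_pixel_width, and fix_wobble keyword arguments; please double-check against radar.py):

import cv2
from pyboreas import BoreasDataset
from pyboreas.data.splits import obj_train
from pyboreas.utils.radar import radar_polar_to_cartesian

bd = BoreasDataset('/data/Boreas/', split=obj_train)
seq = bd.sequences[10]
seq.filter_frames_gt()

for index in range(len(seq.radar_frames)):
    rad = seq.get_radar(index)
    # Bypass the wobble correction, which relies on the (corrupted) azimuth
    # values stored in the boreas-objects-v1 polar images.
    radar_cart = radar_polar_to_cartesian(
        rad.azimuths, rad.polar, rad.resolution,
        cart_resolution=0.25, cart_pixel_width=1000,
        fix_wobble=False)
    cv2.imshow("radar", radar_cart)
    cv2.waitKey(10)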
For a better fix, you could fit a line to the azimuth values vs. row index and then use the corrected azimuth values in the polar_to_cart function.
I have some code for correcting timestamps in the same way that you may find useful:
import numpy as np
from sklearn.linear_model import RANSACRegressor as RANSAC
# d can be 8, 16, 32, 64, ...
def convert_to_byte_array(t, d=64):
    # Convert integer t into d/8 little-endian bytes, matching how the
    # timestamps are stored in the first columns of the raw data.
    byte_array = np.zeros(d, dtype=np.uint8)
    for i in range(d - 1, -1, -1):
        k = t >> i
        if (k & 1):
            byte_array[i] = 1
        else:
            byte_array[i] = 0
    out_array = np.zeros(d // 8, dtype=np.uint8)
    for i in range(d // 8):
        for j in range(8):
            out_array[i] += byte_array[i * 8 + j] * 2**j
    return out_array

def get_time_fit(times):
    # Robustly fit timestamp = intercept + slope * row_index: RANSAC to find
    # inliers, then a least-squares line through the inliers only.
    ransac = RANSAC(min_samples=2, residual_threshold=2000, stop_n_inliers=int(0.98 * times.shape[0]))
    X = np.asarray(range(times.shape[0])).reshape(-1, 1)
    reg = ransac.fit(X, times)
    p = np.polyfit(X[reg.inlier_mask_].squeeze(), times[reg.inlier_mask_].squeeze(), deg=1)
    p = [p[1], p[0]]  # reorder to [intercept, slope]
    return p

def fix_timestamps(raw_data, residual_threshold=2000):
    # Replace outlier timestamps (stored in the first 8 bytes of each row)
    # with values predicted by the linear fit.
    times = raw_data[:, :8].copy().view(np.int64)
    params = get_time_fit(times)
    tcomp = np.asarray(range(times.shape[0])).reshape(-1, 1) * params[1] + params[0]
    outlier = np.abs(times - tcomp) > residual_threshold
    data2 = raw_data.copy()
    for i in range(times.shape[0]):
        if outlier[i, 0]:
            data2[i, :8] = convert_to_byte_array(np.int64(round(i * params[1] + params[0])))
    return data2
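For reference, a rough usage sketch (the file path is hypothetical, and I'm assuming the first 8 bytes of each row of the raw polar image encode a 64-bit timestamp, as in the Oxford radar format that Boreas follows):

import cv2

# Load a raw polar scan and repair its timestamp bytes.
raw = cv2.imread('/data/Boreas/<sequence>/radar/<timestamp>.png', cv2.IMREAD_GRAYSCALE)
fixed = fix_timestamps(raw)
timestamps = fixed[:, :8].copy().view(np.int64).squeeze()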
So you could modify this code to create a fix_azimuths function, for example.
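Here is a rough sketch of what that could look like, reusing the numpy and RANSAC imports above and operating on the decoded azimuth array (in radians) rather than on the raw bytes; the default residual threshold is a guess you would need to tune, and I'm assuming the azimuths increase roughly linearly with row index within a scan:

def fix_azimuths(azimuths, residual_threshold=1e-3):
    # Fit a line to azimuth vs. row index with RANSAC, then replace any
    # rows that deviate from the fit by more than the threshold (radians).
    az = azimuths.reshape(-1, 1).astype(np.float64)
    X = np.arange(az.shape[0]).reshape(-1, 1)
    ransac = RANSAC(min_samples=2, residual_threshold=residual_threshold,
                    stop_n_inliers=int(0.98 * az.shape[0]))
    reg = ransac.fit(X, az)
    p = np.polyfit(X[reg.inlier_mask_].squeeze(), az[reg.inlier_mask_].squeeze(), deg=1)
    fitted = p[0] * X + p[1]
    outlier = np.abs(az - fitted) > residual_threshold
    corrected = az.copy()
    corrected[outlier] = fitted[outlier]
    return corrected.squeeze()

You could then overwrite the frame's azimuths with the corrected array before calling polar_to_cart (assuming the attribute is writable), or pass the corrected array to radar_polar_to_cartesian directly.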
I'll look into fixing the data and uploading that to S3 when I get a chance.
Thank you for your reply. I will give it a try, and if I come across any new findings, I will report them here. Many thanks! (I have tested the other sequences, and they appear to work well; the results are consistent and correct, as you mentioned previously: "For all the sequences other than boreas-objects-v1, this shouldn't be a problem.")