DBow3
Problem with calculating BoW vectors on own features
Hi guys, I have a problem with calculating BoW vectors on my own feature descriptors. I use DBoW3 with FPFH feature descriptors designed for point cloud data (data from a 3D LIDAR in my case). I have implemented the procedure of collecting features and building the vocabulary following the code in demo_general.cpp. My implementation for one training sample is the following:
cv::Mat features_tmp(features_cloud->points.size(), dimensionality, CV_32F);
for(size_t j = 0; j < features_cloud->points.size(); j++)
{
    for(int idx = 0; idx < dimensionality; idx++)
    {
        features_tmp.at<float>(j, idx) = features_cloud->points[j].histogram[idx];
    }
}
cv::Mat features = features_tmp;
Building the vocabulary works well. However, when I use the transform() method on the same feature descriptors, I am not able to iterate over the resulting BowVector object: the 'for' loop with BowVector::iterator produces no output. I use the following code:
for(size_t i = 0; i < training_features.size(); i++)
{
    cout << "Training frame " << i << endl;
    BowVector frame_bow_vector;
    voc.transform(training_features[i], frame_bow_vector);
    cout << "BoW vector for training_frame " << i << ": " << endl;
    for(BowVector::iterator it = frame_bow_vector.begin(); it != frame_bow_vector.end(); it++)
        cout << it->first << ": " << it->second << ", ";
    cout << endl;
    training_bow_descriptors.push_back(frame_bow_vector);
}
Here training_features is a std::vector<cv::Mat>. If I print the first element:
cout << frame_bow_vector.begin()->first << " " << frame_bow_vector.begin()->second << endl;
I get the same output for all the BoW vectors of the reference samples: 0 5.30499e-315. When I use the following code to test the vocabulary by matching the training samples against themselves, I get a score of 0 for all the samples:
BowVector v1, v2;
for(size_t i = 0; i < training_features.size(); i++)
{
    voc.transform(training_features[i], v1);
    for(size_t j = 0; j < training_features.size(); j++)
    {
        voc.transform(training_features[j], v2);
        double score = voc.score(v1, v2);
        // cout << "Frame " << i << " vs Frame " << j << ": " << score << endl;
        printf("Frame %zu vs Frame %zu: %.5f\n", i, j, score);
    }
}
What could be a cause of this problem? Thank you in advance!
I added some code to process floating-point descriptors and created a pull request. You can take a look at the code.
Thank you very much @YachiZhang! I will try once again and let you know whether it helped.
I pulled the latest updates from the repo, rebuilt the DBow3 shared library and updated my project. Sadly, the problem is still there.
When using a std::vector<cv::Mat>, you should use Mat.clone(). If the weights in the training result are all 0, the training result may be wrong.
Thank you for clarification! I will try that.
I changed the code to use Mat.clone() but still have the problem. I use Mat.clone() like this:
void calculateFeatures(PointInTPtr& in, PointInTPtr& keypoints, NormalTPtr& normals, cv::Mat& features)
{
    // Calculate features
    ...
    cv::Mat features_tmp(fpfhs->points.size(), dimensionality, CV_32F);
    for(size_t j = 0; j < fpfhs->points.size(); j++)
    {
        for(int idx = 0; idx < dimensionality; idx++)
        {
            features_tmp.at<float>(j, idx) = fpfhs->points[j].histogram[idx];
        }
    }
    features = features_tmp.clone();
}
and
void loadFeatures(std::string path, cv::Mat& features)
{
    // Load descriptors from the file
    ...
    features.create(features_cloud->points.size(), dimensionality, CV_32F); // CV_32FC1
    for(size_t j = 0; j < features_cloud->points.size(); j++)
    {
        for(int idx = 0; idx < dimensionality; idx++)
        {
            features.at<float>(j, idx) = features_cloud->points[j].histogram[idx];
        }
    }
}
    cv::Mat frame_features;
    feature_estimator->calculateFeatures(sample_cloud, keypoints, normals, frame_features);
    training_features[i] = frame_features.clone();
    ...
}
else
{
    cv::Mat frame_features;
    feature_estimator->loadFeatures(descr_file, frame_features);
    training_features[i] = frame_features.clone();
Is it correct?
I recently researched the FPFH descriptor using LIDAR, but when I use the DBoW3 library I can't get the correct result. I saw on GitHub that you are also studying this topic. Would it be convenient to talk to you? @vovaekb