
How to get the vertebral center of ground truth?

Open zougougou opened this issue 3 years ago • 12 comments

Hi, I used the method for calculating the vertebral centre from your code, but the result is inconsistent with the real centres in the dataset. How can I calculate the real centres?

zougougou avatar Jul 30 '21 01:07 zougougou

Thank you for your interest. Which set of data are you talking about, the MICCAI version or the subject-specific version? (Because the centroids are annotated differently in these two sets)

anjany avatar Jul 30 '21 15:07 anjany

@anjany Hi, thank you for organizing and sharing the data. I am using the VerSe'20 data. I would like to ask how the vertebral centroids in the JSON files of the challenge data are converted to the subject-based centroids. I used the reorient_centroids_to function to convert the subject-based centroids to various orientations, but the values do not match the challenge data you shared, so I would like to know how they were converted. In addition, do you know how the vertebral centroid coordinates were annotated? What annotation software was used?

cloudpanl avatar Aug 03 '21 08:08 cloudpanl

> Thank you for your interest. Which set of data are you talking about, the MICCAI version or the subject-specific version? (Because the centroids are annotated differently in these two sets)

Thank you for your reply. I am using the dataset downloaded via the wget link you published. The centroid annotations in this dataset are different from the ones calculated by your code.

zougougou avatar Aug 03 '21 08:08 zougougou

Hi guys: Sorry for the delay in response. I was on a holiday.

> I use the method of calculating the vertebral center in your code...

@zougougou : WGET will get you the subject-specific version. My code (prepare_data.ipynb) does not calculate any centroids. So, I'm not sure what you mean by them not being the same as the real centres. Would you mind clarifying?

> I would like to ask how the vertebral centroids in the JSON files of the challenge data are converted to the subject-based centroids

@cloudpanl: Following are the differences:

  1. In MICCAI 2019: centroids were in ASL orientation, 1 mm spacing. Each centroid was placed by a human on the 'perceived' centre of mass of the vertebral body using ANDUIN (anduin.bonescreen.de).
  2. In MICCAI 2020: centroids were also in ASL orientation, 1 mm spacing. However, to eliminate the perceived human component, the centroids are simply the computed centres of mass of the vertebral masks.
  3. Lastly, in the subject-specific, restructured version (which we recommend for future research), centroids are in the orientation and spacing of the native scan. They, too, are calculated as the centre of mass of the vertebral body, as segmented by ANDUIN.

Hope this information provides some context. Note that all of it is present in the README files that come with the dataset and in the associated references.
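As a rough, untested sketch (not the repository's official pipeline, and it may differ from the released annotations by sub-voxel amounts), the conversion from a subject-specific mask to challenge-style ASL, 1 mm centroids could look like this with nibabel and scipy; the filename is a placeholder:

```python
import numpy as np
import nibabel as nib
import nibabel.orientations as nio
from scipy.ndimage import center_of_mass

# Placeholder filename for a subject-specific segmentation mask
msk = nib.load('sub-verse004_seg-vert_msk.nii.gz')

# Reorient the mask to ASL, the orientation of the challenge annotations
curr_ornt = nio.io_orientation(msk.affine)
targ_ornt = nio.axcodes2ornt(('A', 'S', 'L'))
msk_asl = msk.as_reoriented(nio.ornt_transform(curr_ornt, targ_ornt))

# Per-label centres of mass in (ASL) voxel coordinates, scaled by the native
# spacing so that the values are expressed on a 1 mm grid
data = np.asanyarray(msk_asl.dataobj)
zooms = nib.affines.voxel_sizes(msk_asl.affine)
ctds = []
for label in np.unique(data)[1:]:
    com_vox = center_of_mass(data == label)
    ctds.append([int(label)] + [round(c * z, 1) for c, z in zip(com_vox, zooms)])
print(ctds)
```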

anjany avatar Aug 23 '21 08:08 anjany

Thanks for your reply, I get it.

cloudpanl avatar Aug 26 '21 08:08 cloudpanl

@anjany Thanks for your reply. In the /utils/data_utilities.py code you provided, there is the following code:

```python
import numpy as np
import nibabel.orientations as nio
from scipy.ndimage import center_of_mass


def calc_centroids(msk, decimals=1, world=False):
    """Gets the centroids from a nifti mask by calculating the centre of mass of each vertebra.

    Parameters:
    ----------
    msk: nibabel nifti mask
    decimals: rounds the coordinates to x decimal digits
    world: if True, returns world (scanner) coordinates instead of voxel coordinates

    Returns:
    ----------
    ctd_list: list of centroids, prefixed by the mask's orientation codes
    """
    msk_data = np.asanyarray(msk.dataobj, dtype=msk.dataobj.dtype)
    axc = nio.aff2axcodes(msk.affine)
    ctd_list = [axc]
    verts = np.unique(msk_data)[1:]
    verts = verts[~np.isnan(verts)]  # remove NaN values
    for i in verts:
        msk_temp = np.zeros(msk_data.shape, dtype=bool)
        msk_temp[msk_data == i] = True
        ctr_mass = center_of_mass(msk_temp)
        if world:
            ctr_mass = msk.affine[:3, :3].dot(ctr_mass) + msk.affine[:3, 3]
            ctr_mass = ctr_mass.tolist()
        ctd_list.append([i] + [round(x, decimals) for x in ctr_mass])
    return ctd_list
```

I used this code to calculate the centroids, and the values I get are inconsistent with the .json files in the subject-specific version.
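For concreteness, the comparison I mean looks roughly like this (the filenames are placeholders):

```python
import json
import nibabel as nib

# Placeholder filenames from the subject-specific release
msk = nib.load('sub-verse004_seg-vert_msk.nii.gz')
computed = calc_centroids(msk, decimals=1)        # uses the function quoted above
print(computed)

with open('sub-verse004_ctd.json') as f:          # placeholder name of the shipped centroid file
    annotated = json.load(f)
print(annotated)
```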

zougougou avatar Aug 29 '21 01:08 zougougou

My code snippet in the now-closed duplicate issue demonstrates the inconsistency between the calculated centroids and those stated in the .json file.

The problem is that if we can't replicate the results for the ground truth images, then we can't use it to extract centroids from our network-inferred images. And if we can't get the centroids of network-inferred images, then we can't use the id_rate as a metric.

rijobro avatar Dec 01 '21 12:12 rijobro

Hi @zougougou and @rijobro: Did you check this comment: https://github.com/anjany/verse/issues/3#issuecomment-903571440

Note that calc_centroids only works for the scans described in point (3) of the comment above. So, were you working with the subject-specific scans of VerSe?
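If you are unsure which version a given file came from, a quick sanity check is to print its on-disk orientation and voxel spacing (the filename is a placeholder):

```python
import nibabel as nib
import nibabel.orientations as nio

img = nib.load('some_scan.nii.gz')            # placeholder filename
print(nio.aff2axcodes(img.affine))            # orientation codes, e.g. ('L', 'A', 'S')
print(nib.affines.voxel_sizes(img.affine))    # voxel spacing in mm
```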

anjany avatar Dec 02 '21 16:12 anjany

Ok, I now understand the code won't work for the subject-based data.

> My code (prepare_data.ipynb) does not calculate any centroids.

Although you don't calculate any centroids in prepare_data.ipynb, the function is present in eval_utilities.py, so it's naturally confusing that we can't replicate results.

rijobro avatar Dec 02 '21 16:12 rijobro

So you were, in fact, working with subject-based data? The code in data_utils.py should then work, in principle. Could you send me the name of the file you worked with?

anjany avatar Dec 03 '21 10:12 anjany

The code snippet mentioned here should highlight what I mean.

I think you are saying that the functionality works, but that when using the subject-based data it gives different results from those stated in the .json file. It is therefore confusing that the .json file is included with the subject-based dataset if it is not applicable in this case.

Apologies if I've misunderstood something.

rijobro avatar Dec 03 '21 16:12 rijobro

Hi @anjany, thanks for your reply. Now I understand how to convert the JSON. I'll summarize your explanation: in the vertebra segmentation challenge, the centroid coordinates are given in the dataset's ASL orientation, so when training on centroids, please reorient the data to the corresponding ASL orientation. The input coordinate values also need to be divided by the corresponding spacing; after that you can inspect them with the ITK-SNAP software. Please give this a thumbs up, haha, it took me a whole afternoon of reading the code to figure it out.
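In code, the divide-by-spacing step amounts to something like this (the filename and coordinate values are placeholders, not from the dataset):

```python
import numpy as np
import nibabel as nib

img_asl = nib.load('verse004_ct_asl.nii.gz')      # placeholder: scan already reoriented to ASL
zooms = nib.affines.voxel_sizes(img_asl.affine)   # voxel spacing in mm
ctd_mm = np.array([123.4, 56.7, 89.0])            # example annotated centroid, not from the dataset
ctd_vox = ctd_mm / zooms                          # voxel indices to inspect in ITK-SNAP
print(ctd_vox)
```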

rw1995 avatar Jan 14 '22 08:01 rw1995