
Does the paper directly cite data from other papers?

apuomline opened this issue 7 months ago · 0 comments

"Hello, author. Your work on deformable DLKA is very impressive. However, we seem to have found an issue: in your paper, the DSC for the Synapse using the nnformer is 86.57%, which is exactly the same as the data run in the nnformer paper. But your experimental environment is not the same as the nnformer's environment. That is to say, you have achieved completely identical experimental results in different environments?" nnformer: image nnformer-experiment:We run all experiments based on Python 3.6, PyTorch 1.8.1 and Ubuntu 18.04. All training procedures have been performed on a single NVIDIA 2080 GPU with 11GB memory. The initial learning rate is set to 0.01 and we employ a “poly” decay strategy as described in Equation 7. The default optimizer is SGD where we set the momentum to 0.99. The weight decay is set to 3e-5. We utilize both cross entropy loss and dice loss by simply summing them up. The number of training epochs (i.e., max epoch in Equation 7) is 1000 and one epoch contains 250 iterations. The number of heads of multi-head self-attention used in different encoder stages is [6, 12, 24, 48] on Synapse. In the rest two datasets, the number of heads becomes [3, 6, 12, 24].

Deformable D-LKA results: [screenshot of results table omitted]

Deformable D-LKA experimental setup (quoted from the paper): "We have implemented both 2D and 3D models using the PyTorch framework and performed training on a single RTX 3090 GPU. For the 2D method, a batch size of 20 was used, along with Stochastic Gradient Descent (SGD) employing a base learning rate of 0.05, a momentum of 0.9, and a weight decay of 0.0001. The training process consisted of 400 epochs, employing a combination of cross-entropy and Dice loss, ..."
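Putting the two quoted setups side by side makes the differences explicit. The values in this small sketch are copied directly from the quotes above; anything a quote does not mention is left as None.

```python
# Side-by-side of the two quoted training setups (values copied from the quotes above;
# fields not mentioned in a quote are left as None).
nnformer_setup = {
    "gpu": "NVIDIA 2080 (11 GB)", "optimizer": "SGD", "base_lr": 0.01,
    "momentum": 0.99, "weight_decay": 3e-5, "epochs": 1000,
    "iters_per_epoch": 250, "batch_size": None, "loss": "cross-entropy + dice",
}
dlka_2d_setup = {
    "gpu": "RTX 3090", "optimizer": "SGD", "base_lr": 0.05,
    "momentum": 0.9, "weight_decay": 1e-4, "epochs": 400,
    "iters_per_epoch": None, "batch_size": 20, "loss": "cross-entropy + dice",
}

for key in sorted(set(nnformer_setup) | set(dlka_2d_setup)):
    print(f"{key:>16}: {nnformer_setup.get(key)!s:<22} vs {dlka_2d_setup.get(key)!s}")
```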

"In light of this, we are very curious to know whether the segmentation figures for the Synapse using the nnformer in your paper were taken directly from the nnformer's work?"

apuomline · Jul 21 '24 12:07