FC-DenseNet

How was number of trainable parameters calculated?

Open Faur opened this issue 7 years ago • 5 comments

I have tried to re-implement the architecture described in the paper exactly, just in TensorFlow, but I don't get the correct number of trainable parameters. I can't find where this is calculated in the code, so I was hoping someone could help me out.

Paper: 56 layers: 1.5M; 103 layers: 9.4M

My implementation: 56 layers: 1.4M; 103 layers: 9.2M

The discrepancy is small, so normally I wouldn't care, but I can't quite reproduce the performance results from the paper, so tracking it down might reveal a bug in my code.
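Small discrepancies like this often come down to per-layer bookkeeping: whether the convolutions carry a bias term, and whether BatchNorm's moving mean/variance are counted as trainable. A minimal sketch of that bookkeeping (the channel sizes below are hypothetical, not taken from the paper):

```python
import numpy as np  # only used implicitly by convention; counts are plain ints

def conv_params(c_in, c_out, k=3, bias=True):
    """Trainable parameters in a k x k convolution."""
    return k * k * c_in * c_out + (c_out if bias else 0)

def bn_params(c, trainable_only=True):
    """BatchNorm has gamma/beta (trainable) plus moving mean/var (not)."""
    return 2 * c if trainable_only else 4 * c

# Example: one composite layer (BN -> ReLU -> 3x3 conv) with a hypothetical
# 128 input channels and growth rate 16.
c_in, growth = 128, 16
with_bias = conv_params(c_in, growth, bias=True) + bn_params(c_in)
no_bias = conv_params(c_in, growth, bias=False) + bn_params(c_in)
print(with_bias - no_bias)  # each such layer differs by `growth` params
```

Summed over the ~100 composite layers of the deep variant, choices like these can easily shift the total by a few hundred thousand parameters.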

Faur avatar Nov 28 '17 12:11 Faur

could you tell me your best result?

dongzhuoyao avatar Mar 07 '18 06:03 dongzhuoyao

[image attachment: results]

There were some issues with the 103-layer implementation (too small a batch size and image resolution).

Faur avatar Mar 07 '18 11:03 Faur

@Faur @dongzhuoyao Hi, do you still remember the FLOPs of the model when you ran FC-DenseNet? A reviewer of my paper asked me to report the parameter and FLOPs values, but I couldn't run the model because the dataset could not be loaded. I look forward to your reply, thank you very much.

xiaomixiaomi123zm avatar Jun 05 '20 01:06 xiaomixiaomi123zm

@xiaomixiaomi123zm I am sorry, but I can't help you. It has been a while since I worked on this, and I no longer have access to the code base.

But if the issue is just the dataset, you should be able to make some dummy data with np.zeros quite easily; then you should be able to get the FLOPs of the model.
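A sketch of the idea: stand in dummy zeros for the real dataset, then either feed them through the model and use your framework's profiler, or estimate FLOPs analytically per layer. The input shape and the 48-filter first conv below are assumptions for illustration, not values from the repository:

```python
import numpy as np

def conv2d_flops(h, w, c_in, c_out, k=3):
    """Multiply-accumulate count for a k x k conv over an h x w feature map
    (stride 1, 'same' padding). Double it if mults and adds count separately."""
    return h * w * k * k * c_in * c_out

# Dummy input in place of the real dataset; the shape is an assumption.
dummy = np.zeros((1, 224, 224, 3), dtype=np.float32)
_, h, w, c = dummy.shape

# Hypothetical first convolution of the network (48 output channels).
print(conv2d_flops(h, w, c, 48))
```

Feeding `dummy` through the built model once (e.g. `model(dummy)` in Keras) is usually enough for framework profilers such as `tf.compat.v1.profiler` to report total FLOPs without any real data.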

Faur avatar Jun 05 '20 07:06 Faur

Hi, I don't quite understand how to do it. Would it be convenient for you to add my QQ (907675183) and tell me about it? I have tried a lot of ways without success, thank you very much!! @Faur

xiaomixiaomi123zm avatar Jun 05 '20 07:06 xiaomixiaomi123zm