FC-DenseNet
How was number of trainable parameters calculated?
I have tried to re-implement the architecture described in the paper exactly, just in TensorFlow, but I don't get the correct number of trainable parameters. I can't find where this is calculated in the code, so I was hoping someone could help me out.
|                   | 56 layers | 103 layers |
|-------------------|-----------|------------|
| Paper             | 1.5 M     | 9.4 M      |
| My implementation | 1.4 M     | 9.2 M      |
The discrepancy is small, so normally I wouldn't care, but I can't quite reproduce the performance results from the paper, so perhaps this could help reveal a bug in my code.
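For reference, this is roughly how I count the trainable parameters (a minimal sketch, assuming TF 1.x graph mode, called after the graph has been built):

```python
import numpy as np
import tensorflow as tf

def count_trainable_params() -> int:
    """Sum the element counts of all trainable variables in the default graph."""
    return int(sum(np.prod(v.get_shape().as_list())
                   for v in tf.trainable_variables()))

# After building the model graph:
# print(count_trainable_params())  # compare against the paper's 1.5 M / 9.4 M
```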
Could you tell me your best result?
There were some issues with the 103-layer implementation (too small a batch size and image resolution).
@Faur @dongzhuoyao Hi, do you still remember the FLOPs of the model from when you ran FC-DenseNet? The reviewer of my paper asked me to report the parameter and FLOP counts, but I couldn't run the model because the dataset could not be loaded. I look forward to your reply, thank you very much.
@xiaomixiaomi123zm I am sorry, but I can't help you. It has been a while since I worked on this, and I no longer have access to the code base.
But if the issue is just the dataset, you should be able to make some dummy data with np.zeros quite easily; then you should be able to get the FLOPs of the model.
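Something along these lines should work (an untested sketch, assuming TF 1.x graph mode; `build_fc_densenet` is a hypothetical stand-in for whatever function builds your model, and 360x480 is the CamVid resolution used in the paper):

```python
import numpy as np
import tensorflow as tf  # assuming TF 1.x graph mode

graph = tf.Graph()
with graph.as_default():
    # The profiler only needs tensor shapes, not real images,
    # so zeros can stand in for the dataset that won't load.
    dummy_images = tf.constant(np.zeros((1, 360, 480, 3), dtype=np.float32))
    logits = build_fc_densenet(dummy_images)  # hypothetical model builder

    # Count the floating-point operations in the graph.
    opts = tf.profiler.ProfileOptionBuilder.float_operation()
    flops = tf.profiler.profile(graph, options=opts)
    print('Total FLOPs:', flops.total_float_ops)
```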
Hi, I don't quite understand how to do it. Would it be convenient for you to add my QQ (907675183) and talk me through it? I have tried a lot of ways without success, thank you very much!! @Faur