TrajectoryNet

Request to share trained weights.

Open · SK124 opened this issue 5 years ago • 1 comment

Hi!

I am quite fascinated by the TrajectoryNet paper and wanted to implement it. I do not have the required GPU capacity on my machine, so I have to rely on Google Colab, which also has GPU usage restrictions. Could the authors share their trained weights for the two datasets mentioned in the paper? Since I plan to train on the same two datasets, it would be of huge benefit to me if the authors, or anyone who has trained the model fully, could share their weights.

Also, I want to experiment with the model on different datasets, so I was wondering if transfer learning would be possible with this model, and if so, which layers I would need to modify.

Thanks and Regards.

@atong01 can you help me out here please?

SK124 · Sep 20 '20 19:09

Hi, thanks for your interest and sorry for not getting back sooner.

I would mention that, because of the model architecture, it trains almost as fast on a good CPU as on a GPU; in fact, many of these models were trained purely on CPU because of a bug in PyTorch at the time. I'll dig around for the weights I used in the paper and get back to you on those.
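For anyone who does end up with a shared checkpoint on a CPU-only machine, here is a minimal sketch of loading it, assuming the weights are saved as an ordinary PyTorch state dict. The filename and the tiny stand-in architecture below are placeholders, not the repo's actual API, so adapt them to however the model is actually constructed:

```python
import torch
import torch.nn as nn

device = torch.device("cpu")

# Stand-in architecture only; the real model is built by the code in this repo.
model = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))

# map_location lets a checkpoint that was saved on a GPU load on a CPU-only machine.
state_dict = torch.load("trajectorynet_checkpoint.pt", map_location=device)  # hypothetical filename

model.load_state_dict(state_dict)  # may need unwrapping if the checkpoint stores extra metadata
model.to(device)
model.eval()
```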

As for transfer learning, I think it is difficult in single-cell data because of the differences in the distributions of cells, i.e. batch effects, between runs and machines. I assume you mean freezing some of the "bottom" layers, as is done with images (see the sketch below for what that recipe looks like), but it does not really make sense here.
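For context, the image-style recipe being referred to looks roughly like the following generic PyTorch sketch. The network here is a stand-in, not TrajectoryNet's actual architecture; the point above is that this recipe is unlikely to help with single-cell batch effects:

```python
import torch
import torch.nn as nn

# Stand-in network: a few "bottom" layers followed by a "top" layer.
model = nn.Sequential(
    nn.Linear(2, 64), nn.Tanh(),   # "bottom" layers that would be frozen
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, 2),              # "top" layer left trainable
)

# Freeze everything, then re-enable gradients for the final layer only.
for param in model.parameters():
    param.requires_grad = False
for param in model[-1].parameters():
    param.requires_grad = True

# Optimize only the parameters that still require gradients.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
```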

atong01 · Oct 04 '20 13:10