
Hello, thank you for sharing this wonderful package. I would like to ask: how many GPU cards did you use, and how long did it take to train the...

Hello, I would like to ask: what value does the loss typically converge to during the pretraining stage? I am currently running MAE-B pretraining on my own data and find that the loss stops decreasing at around 0.2. When I overlay the reconstruction results on the original images, they all look rather blurry.