smallshingshing
I ran `python train.py -data_pkl ./bpe_deen/bpe_vocab.pkl -train_path ./bpe_deen/deen-train -val_path ./bpe_deen/deen-val -log deen_bpe -embs_share_weight -proj_share_weight -label_smoothing -save_model trained -b 64 -warmup 128000 -epoch 400`, but training is slow and inaccurate...
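One possible factor is the `-warmup 128000` setting: under the Noam learning-rate schedule commonly used by Transformer training scripts, the learning rate ramps up linearly for `warmup` steps before decaying, so a very large warmup keeps the learning rate tiny for a long time and early training looks slow. A minimal sketch of that schedule (the function name and `d_model=512` default are assumptions for illustration, not taken from the repo):

```python
def noam_lr(step: int, d_model: int = 512, warmup: int = 128000, factor: float = 1.0) -> float:
    """Noam schedule: lr = factor * d_model^-0.5 * min(step^-0.5, step * warmup^-1.5).

    The learning rate grows linearly for `warmup` steps, peaks at
    step == warmup, then decays proportionally to step^-0.5.
    """
    step = max(step, 1)  # avoid step=0 blowing up step**-0.5
    return factor * d_model ** -0.5 * min(step ** -0.5, step * warmup ** -1.5)

# With warmup=128000, the learning rate is still far below its peak
# after thousands of steps, which can look like slow/poor training.
early = noam_lr(1_000)
peak = noam_lr(128_000)
```

If this matches the script's scheduler, lowering `-warmup` (e.g. to a few thousand steps) would bring the peak learning rate much earlier in training.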
There are 2D landmarks in the labels, but I can't find any 3D landmark labels. Am I missing something?
I trained the supernet with the default settings and got the following accuracy: `TRAIN Iter 150000: lr = 0.000003, loss = 2.805815, Top-1 err = 0.417285, Top-5 err = 0.202197,...
Using `ntm` with the default settings works fine on the copy task, but it doesn't work on the associative recall task. I ran for more than 20,000 training steps over 5 hours...
Hi, I used the default settings and the Single-Node Training Examples command line from [here](https://github.com/mit-han-lab/efficientvit/blob/master/applications/cls.md#training), but I cannot reproduce the classification accuracy. For efficientvit-b1-224, I reached 79.26% top-1 accuracy while you...