alanspike

Results: 5 comments of alanspike

The 79.2 for L1 is trained w/ distillation for 300 epochs. We haven't trained the model w/o distillation, but we can try it later.

Hi @youjinChung, thanks again for your interest. Just checking whether you were able to run the code successfully.

Hi @edwardyehuang, with the latest iOS and Xcode, you can get the latency directly on CPU or CPU & GPU by following this [link](https://developer.apple.com/videos/play/wwdc2022/10027/).
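
For reference, the session above covers Xcode's on-device performance report, which needs no code. If you'd rather time predictions programmatically, a minimal sketch along these lines should work; the model name, input feature name, and input shape below are placeholders, so adapt them to the actual model:

```swift
import CoreML
import Foundation

// Minimal latency sketch. Assumptions: a compiled model "Model.mlmodelc" bundled
// with the app, an input feature named "image" with shape 1x3x224x224, and that
// averaging a few predictions after one warm-up run is a rough-enough estimate.
func measureLatency(computeUnits: MLComputeUnits, runs: Int = 20) throws -> TimeInterval {
    let config = MLModelConfiguration()
    config.computeUnits = computeUnits  // .cpuOnly or .cpuAndGPU

    let url = Bundle.main.url(forResource: "Model", withExtension: "mlmodelc")!  // hypothetical name
    let model = try MLModel(contentsOf: url, configuration: config)

    // Hypothetical input; replace the feature name, shape, and dtype with the model's own.
    let input = try MLDictionaryFeatureProvider(dictionary: [
        "image": MLMultiArray(shape: [1, 3, 224, 224], dataType: .float32)
    ])

    _ = try model.prediction(from: input)  // warm-up: excludes one-time setup cost

    let start = CFAbsoluteTimeGetCurrent()
    for _ in 0..<runs {
        _ = try model.prediction(from: input)
    }
    return (CFAbsoluteTimeGetCurrent() - start) / Double(runs)  // average seconds per prediction
}

// Example: compare CPU-only vs. CPU & GPU latency.
do {
    let cpuLatency = try measureLatency(computeUnits: .cpuOnly)
    let gpuLatency = try measureLatency(computeUnits: .cpuAndGPU)
    print("CPU: \(cpuLatency * 1000) ms, CPU & GPU: \(gpuLatency * 1000) ms")
} catch {
    print("Latency measurement failed: \(error)")
}
```

Restricting `computeUnits` to `.cpuOnly` vs. `.cpuAndGPU` is what lets you compare the two backends; the Xcode performance report gives the same breakdown per compute unit without any of this code.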

Hi @yeyan00, did you train the model on your own dataset, or did you apply the ADE20K pre-trained model to your dataset?

Hi @yeyan00, I'll close the issue for now. Please feel free to reopen it if you'd like to discuss the model/dataset training further. Thanks.