Training for deep models
When training a deep model, can parameters be shared across layers, for example when training a 60-layer model? A sketch of what this would look like follows below.
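To make the question concrete, here is a minimal PyTorch sketch of cross-layer parameter sharing (ALBERT-style weight tying): a single block's weights are reused at every depth, so a 60-layer forward pass stores only one layer's worth of parameters. The names (`SharedLayerStack`, `num_layers`) are hypothetical illustrations, not taken from this repository's code.

```python
import torch
import torch.nn as nn


class SharedLayerStack(nn.Module):
    """Applies one shared block repeatedly: a 60-layer-deep forward
    pass that stores only a single layer's parameters."""

    def __init__(self, d_model: int = 512, num_layers: int = 60):
        super().__init__()
        # One block whose parameters are reused at every depth.
        self.shared_block = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=8, batch_first=True
        )
        self.num_layers = num_layers

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for _ in range(self.num_layers):
            x = self.shared_block(x)  # same weights at every layer
        return x


if __name__ == "__main__":
    model = SharedLayerStack(num_layers=60)
    out = model(torch.randn(2, 16, 512))
    print(out.shape)  # torch.Size([2, 16, 512])
    # Parameter count equals one layer, not 60 layers.
    print(sum(p.numel() for p in model.parameters()))
```

Whether this is supported here, and whether sharing hurts accuracy at 60 layers compared to independent per-layer weights, is exactly what I am asking.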