
[Bug]: bug on overtraining

Open · kro-ai opened this issue 1 year ago · 1 comment

Project Version

3.2.5

Platform and OS Version

Windows 11, Firefox 130

Affected Devices

PC

Existing Issues

No response

What happened?

Instead of showing the actual number of epochs remaining for overtraining, it just says "g/total: 100 d/total: 200". It doesn't change.

Steps to reproduce

  1. Preprocess
  2. Extract
  3. Start training and look in the terminal ...

Expected behavior

I expect it to show the number of epochs remaining for overtraining.

Attachments

No response

Screenshots or Videos

(screenshot attached)

Additional Information

No response

kro-ai · Sep 19 '24

@ShiromiyaG

blaisewf · Sep 19 '24

@blaisewf with all due respect, what do you mean "not planned"? In my opinion this is an incredibly important feature, and right now it's broken and pointless because it isn't giving the user any useful information whatsoever. I think closing the issue and tagging an ACTUAL bug as "not planned" is incredibly strange.

kro-ai · Nov 01 '24

@kro-ai did you actually reach the point of overtraining? You've configured the overtraining threshold at 100 epochs, so the generator loss has to go up consecutively for 100 epochs before training stops.

AznamirWoW · Nov 01 '24

> @kro-ai did you actually reach the point of overtraining? You've configured the overtraining threshold at 100 epochs, so the generator loss has to go up consecutively for 100 epochs before training stops.

I was under the impression that "number of epochs remaining for overtraining" would show the number of epochs? For example "number of epochs remaining for overtraining: 98", and then it keeps going down until it reaches 0, at which point training stops because it hasn't found a better epoch. When it does find a new best epoch, it resets to the configured number, in this case 100. Maybe I've completely misunderstood it, and if so I apologize. This is a feature Applio used to have; I was under the impression this was that feature?
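Roughly, the behaviour I expected, as a sketch (the function name `expected_countdown` and its `patience` parameter are made up for illustration; this is not Applio's code):

```python
def expected_countdown(losses, patience=100):
    """Countdown that resets whenever a new best (lowest) loss is found."""
    best_loss = float("inf")
    remaining = patience
    for epoch, loss in enumerate(losses, start=1):
        if loss < best_loss:
            best_loss = loss
            remaining = patience  # new best epoch: reset the countdown
        else:
            remaining -= 1
        print(f"epoch {epoch}: epochs remaining for overtraining: {remaining}")
        if remaining == 0:
            print("no better epoch found, stopping")
            break
```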

kro-ai · Nov 01 '24

> @kro-ai did you actually reach the point of overtraining? You've configured the overtraining threshold at 100 epochs, so the generator loss has to go up consecutively for 100 epochs before training stops.

> I was under the impression that "number of epochs remaining for overtraining" would show the number of epochs? For example "number of epochs remaining for overtraining: 98", and then it keeps going down until it reaches 0, at which point training stops because it hasn't found a better epoch. When it does find a new best epoch, it resets to the configured number, in this case 100. Maybe I've completely misunderstood it, and if so I apologize. This is a feature Applio used to have; I was under the impression this was that feature?

You set a threshold, say 50 epochs. At the end of each epoch the detector checks whether the loss is going down or up. If it goes up for 50 consecutive epochs, training stops. The output shows the progress toward this stop as 'selected threshold minus the number of epochs that have consistently increased the loss'.
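In pseudocode, roughly (the `OvertrainingDetector` class here only illustrates that logic and is not Applio's actual implementation):

```python
class OvertrainingDetector:
    """Stops training after the generator loss rises for `threshold` consecutive epochs."""

    def __init__(self, threshold: int = 50):
        self.threshold = threshold
        self.prev_loss = None
        self.consecutive_increases = 0

    def update(self, gen_loss: float) -> bool:
        """Call once per epoch; returns True when training should stop."""
        if self.prev_loss is not None and gen_loss > self.prev_loss:
            self.consecutive_increases += 1
        else:
            # Loss went down (or stayed flat), so the streak is broken.
            self.consecutive_increases = 0
        self.prev_loss = gen_loss
        remaining = self.threshold - self.consecutive_increases
        print(f"epochs remaining before overtraining stop: {remaining}")
        return self.consecutive_increases >= self.threshold
```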

AznamirWoW · Nov 01 '24