Joe Penna

3 issues by Joe Penna

I've seen this mentioned in some other threads here, but figured I'd start a new issue. This is what my trained model looked like before: ![B63FC792-1BFF-40DB-9F8A-D82D766428EF](https://user-images.githubusercontent.com/100188076/189505949-ade1c630-e33b-4bcc-96b9-f4511d60da8c.png) And after training someone...

Starting this "issue" to show how easy it is to overtrain an object. 200 regularization images. 13 subject photos. 2000 steps.

## `GROUND TRUTH`

![lic_guitar_112](https://user-images.githubusercontent.com/100188076/190392679-1f9e4906-3b16-41cc-8212-37297a69615a.jpg) ![lic_guitar_106](https://user-images.githubusercontent.com/100188076/190392740-c2cd4b27-1a91-48e7-af17-dd03d7e52d21.jpg)

# Initial Test Generations...

It seems like the entire latent space gets shifted toward what you're training, and the longer you train, the more of it is affected. However, @nikopueringer was figuring out what's missing...
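The drift described above is what the regularization images are meant to counter. A minimal sketch of the idea, with everything simplified to plain scalar lists (a hypothetical `dreambooth_loss` helper; real DreamBooth training computes a noise-prediction MSE inside a diffusion loop, not this toy MSE):

```python
# Toy sketch of a prior-preservation objective (assumptions: scalar
# "predictions" stand in for denoiser outputs; prior_weight is a
# hypothetical knob, not a specific repo flag).

def mse(pred, target):
    # Mean squared error over two equal-length lists.
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

def dreambooth_loss(instance_pred, instance_target,
                    class_pred, class_target, prior_weight=1.0):
    # Instance term: fit the handful of subject photos.
    instance_loss = mse(instance_pred, instance_target)
    # Prior term: the regularization (class) images anchor the model's
    # notion of the class so the whole latent space doesn't drift
    # toward the subject as training continues.
    prior_loss = mse(class_pred, class_target)
    return instance_loss + prior_weight * prior_loss
```

With `prior_weight=0` the prior term vanishes, which corresponds to the overtraining behavior shown above: nothing pulls the class back toward its original distribution.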