ProGamerGov

Results: 515 comments by ProGamerGov

I mostly used a 1050 Ti (Google Colab was used for earlier experiments until I realized that I could do it on my laptop), and it took a couple of hours...

@Mayukhdeb So, neural-dream's FFT decorrelation doesn't actually do decorrelation. I mistakenly made it alter the frequencies instead. My [dream-creator](https://github.com/ProGamerGov/dream-creator) project has fully functional decorrelation modules, and I would recommend referring...
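For anyone curious what a working spatially decorrelated parameterization looks like, here's a rough PyTorch sketch of an FFT image parameterization in the spirit of the dream-creator modules. The names and defaults are just illustrative, not that project's exact code:

```python
# Minimal sketch of an FFT (spatially decorrelated) image parameterization.
# Illustrative only; dream-creator's actual modules differ in details.
import torch
import numpy as np

def fft_image(size=(1, 3, 224, 224), decay_power=1.0):
    b, ch, h, w = size
    # Frequency grid for a real 2D FFT of an h x w image
    fy = np.fft.fftfreq(h)[:, None]
    fx = np.fft.rfftfreq(w)[None, :]
    freqs = np.sqrt(fx * fx + fy * fy)
    # Scale each coefficient by 1/frequency so the spectrum is roughly "white"
    scale = 1.0 / np.maximum(freqs, 1.0 / max(h, w)) ** decay_power
    scale = torch.tensor(scale, dtype=torch.float32)

    # Trainable spectrum: real + imaginary parts in the last dimension
    spectrum = (torch.randn(b, ch, h, w // 2 + 1, 2) * 0.01).requires_grad_(True)

    def to_image():
        scaled = torch.view_as_complex(spectrum) * scale
        return torch.fft.irfft2(scaled, s=(h, w))

    return spectrum, to_image

# Usage: optimize `spectrum` directly, render the image with `to_image()`
spectrum, to_image = fft_image()
img = to_image()
```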

@aertist The features seem to be underdeveloped in the second image, so maybe you accidentally used fewer iterations?

@aertist Normally I avoid video because testing takes a long time, and because it usually requires another library for temporal coherence. Thank you! And it's always really cool to see...

@ad48hp That issue occurs when the smallest octave image size is too small for the model. A quick test with the GoogleNet Caffe models shows that they require both height...
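Roughly how the octave sizes shrink (a sketch, not neural-dream's actual code; `min_size` is a hypothetical placeholder, since the real minimum depends on the model you load):

```python
# Sketch: how successive octaves shrink the working image, and why a small
# starting image can fall below a model's minimum input size.
def octave_sizes(height, width, num_octaves=4, octave_scale=0.6, min_size=64):
    sizes = []
    for n in range(num_octaves - 1, -1, -1):  # smallest octave first
        h = int(height * (octave_scale ** n))
        w = int(width * (octave_scale ** n))
        sizes.append((h, w))
        if h < min_size or w < min_size:
            print(f"Warning: octave size {h}x{w} may be too small for the model")
    return sizes

print(octave_sizes(512, 512))
```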

@ad48hp I'll look into it. Have you tried using `-backend mkl` when using the CPU at all? That should make it faster.
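You can check whether your PyTorch build was actually compiled with MKL before trying the flag, for example:

```python
# Quick check before passing `-backend mkl`; assumes a standard PyTorch install.
import torch

if torch.backends.mkl.is_available():
    print("MKL is available; `-backend mkl` should speed up CPU runs")
else:
    print("This PyTorch build was not compiled with MKL")
```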

The `-learning_rate` parameter is the same thing as the 'step size' parameter in other DeepDream projects; in Dreamify, it's called `step_size`. It's the size of the 'jump' or 'step' towards the...
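For context, here's a rough sketch of where that step size enters a DeepDream-style update (illustrative only, not the project's exact code):

```python
# One gradient-ascent step on the image: the step size / learning rate directly
# controls how big a jump the image takes toward stronger activations.
import torch

def dream_step(img, model, step_size=1.5):
    img = img.detach().requires_grad_(True)
    activations = model(img)
    loss = activations.norm()                 # maximize activation magnitude
    loss.backward()
    grad = img.grad / (img.grad.abs().mean() + 1e-8)  # normalize gradient scale
    return (img + step_size * grad).detach()  # larger step_size = bigger jump
```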

@ad48hp For the moment you can try this: https://github.com/ProGamerGov/neural-dream/tree/output-image-name. The `-output_start_num` parameter will let you start the output number at a number of your choosing. I'll probably delete the branch...

@ad48hp The parameters don't seem to raise that PyTorch error that you posted above. The content image you are using seems extremely bright, and using `-adjust_contrast 99.999` seems to change...
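For reference, the general idea behind a percentile-based contrast adjustment looks something like the sketch below; this is an assumption about the behavior, not neural-dream's actual implementation:

```python
# Percentile-based contrast stretching sketch: clip at a high percentile and
# rescale, so a few extremely bright pixels don't dominate the dynamic range.
import torch

def stretch_contrast(img, percentile=99.999):
    high = torch.quantile(img.flatten(), percentile / 100.0)
    img = img.clamp(0, high)
    return img * (255.0 / (high + 1e-8))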

The learning rate that you are using seems like it might be a bit too high. Lowering it to 1.6 resulted in this output at 29 iterations: ![out_29_lp0](https://user-images.githubusercontent.com/10626398/77576244-809ccc80-6e9a-11ea-9ae7-5f50829b641b.png) Using `-lap_scale...