Sander Dieleman

136 comments by Sander Dieleman

Yep, that is pretty sweet :) If I could just pass in a layer representing my network, I would already be pretty happy with that! With regard to Keras's `compile`...

> I think it would be good to have a separation between model container / training loop and dataset container / batch generator.

agreed.

> The model would have train...

> nolearn.lasagne has net.initialize() that does the same thing as compile but it's implicitly called during fit() if it hasn't been called by the user before. But a difference I...

> This is sorta what I think should be avoided. Now there needs to be a callback API, and some standard for accessing history. I think it's a lot less...

> So if you want to support a fit() method that takes the entire dataset (such as it's often done in sklearn), you'll need to split it into batches, and...
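The batch-splitting that a whole-dataset `fit()` would have to do internally can be sketched as follows. This is an illustrative helper, not part of Lasagne or any library discussed here; the name and signature are hypothetical.

```python
import numpy as np

def iterate_minibatches(inputs, targets, batch_size, shuffle=False):
    """Yield successive (inputs, targets) minibatches from full arrays.

    Hypothetical sketch of what an sklearn-style fit(X, y) would do
    behind the scenes before handing batches to the training function.
    """
    assert len(inputs) == len(targets)
    indices = np.arange(len(inputs))
    if shuffle:
        np.random.shuffle(indices)
    for start in range(0, len(inputs), batch_size):
        batch = indices[start:start + batch_size]
        yield inputs[batch], targets[batch]
```

The last batch may be smaller than `batch_size`; whether to drop it or pad it is exactly the kind of policy decision such a `fit()` would have to make for the user.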

As mentioned in the discussion before, I think data loading should be offloaded to the user entirely, as supporting various data formats directly in the library would make things a...
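If data loading is offloaded to the user entirely, the training loop only needs to accept some iterable of batches and never touches file formats at all. A minimal sketch of that interface (all names here are hypothetical, not an actual Lasagne API):

```python
def fit(train_fn, batch_iter, num_epochs=1):
    """Train for num_epochs by consuming user-supplied batches.

    train_fn:   callable (inputs, targets) -> scalar loss, e.g. a
                compiled Theano function (hypothetical).
    batch_iter: zero-argument callable returning an iterable of
                (inputs, targets) pairs; the user decides how the
                data is stored, loaded, and batched.
    """
    history = []
    for epoch in range(num_epochs):
        losses = [train_fn(x, y) for x, y in batch_iter()]
        history.append(sum(losses) / len(losses))  # mean epoch loss
    return history
```

Because `batch_iter` is just a callable, the user can back it with in-memory arrays, HDF5, or a streaming generator without the library knowing or caring.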

This would be great to have, but I think it falls outside the scope of the main library a bit (and would require extra dependencies), so I think it's more...

Sorry, I should have linked to that directly! Here it is: https://github.com/Lasagne/Recipes/blob/master/utils/network_repr.py

> the factor and the fixed output shape would result in different behaviour for variable input sizes (the factor would adapt to the size, the fixed shape would be, well,...

I see. But currently the input is not being rescaled before the transform is applied, right? I just think it's a bit strange outside of that context; my thought process...