Sander Dieleman

136 comments by Sander Dieleman

I expected that we'd probably see eye to eye on this ;) It doesn't have to be just two levels, by the way; I just wanted to give an example to...

Cool :) The `train()` generator I was working on had a flexible interface, allowing for the model and training setup to be specified in a variety of ways and using...
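For context, a minimal sketch of what such a generator-based training loop could look like, using the standard Lasagne/Theano calls; the argument names and defaults here are hypothetical, not the actual interface that was being discussed:

```python
import lasagne
import theano
import theano.tensor as T

def train(output_layer, data_iterator, loss_fn=None, update_fn=None,
          learning_rate=0.01):
    """Hypothetical generator-based training loop: yields the loss after
    every update, so the caller decides when to stop, log or snapshot."""
    input_var = T.matrix('inputs')
    target_var = T.ivector('targets')

    prediction = lasagne.layers.get_output(output_layer, input_var)
    loss_fn = loss_fn or lasagne.objectives.categorical_crossentropy
    loss = loss_fn(prediction, target_var).mean()

    params = lasagne.layers.get_all_params(output_layer, trainable=True)
    update_fn = update_fn or lasagne.updates.nesterov_momentum
    updates = update_fn(loss, params, learning_rate=learning_rate)

    train_step = theano.function([input_var, target_var], loss,
                                 updates=updates)
    for inputs, targets in data_iterator:
        yield train_step(inputs, targets)
```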

> I could see an issue if you wanted to implement a GAN for example, as you have to alternate between updating the discriminator and generator.

True, if `train()` is...
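To make the GAN point concrete, here is a sketch of the two separate update steps such a model needs and that a single monolithic training loop cannot easily alternate between; the losses and the assumption of a sigmoid discriminator output are simplifications for illustration:

```python
import lasagne
import theano
import theano.tensor as T

def build_gan_updates(generator, discriminator, noise_var, real_var,
                      learning_rate=2e-4):
    """Sketch: compile one function that only updates the discriminator
    and one that only updates the generator, to be called in alternation.
    Assumes `discriminator` ends in a sigmoid output."""
    fake = lasagne.layers.get_output(generator, noise_var)
    p_real = lasagne.layers.get_output(discriminator, real_var)
    p_fake = lasagne.layers.get_output(discriminator, fake)

    d_loss = -(T.log(p_real) + T.log(1 - p_fake)).mean()
    g_loss = -T.log(p_fake).mean()

    d_params = lasagne.layers.get_all_params(discriminator, trainable=True)
    g_params = lasagne.layers.get_all_params(generator, trainable=True)

    train_d = theano.function([real_var, noise_var], d_loss,
                              updates=lasagne.updates.adam(d_loss, d_params,
                                                           learning_rate))
    train_g = theano.function([noise_var], g_loss,
                              updates=lasagne.updates.adam(g_loss, g_params,
                                                           learning_rate))
    return train_d, train_g  # the caller alternates between these two
```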

> > Construct your network such that it processes both parts at once, and split them up afterwards. I.e., instead of: [...]

That's awesome! You have probably just saved...
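A minimal Lasagne sketch of that trick: stack the two inputs along the batch axis, push them through a shared layer, and split the result again with `SliceLayer`. The shapes are made up for the example:

```python
from lasagne.layers import InputLayer, ConcatLayer, DenseLayer, SliceLayer

batch_size, num_features = 64, 100

# Two inputs that should be processed by the same layers.
l_in_a = InputLayer((batch_size, num_features))
l_in_b = InputLayer((batch_size, num_features))

# Process both parts at once by stacking them along the batch axis...
l_both = ConcatLayer([l_in_a, l_in_b], axis=0)
l_hidden = DenseLayer(l_both, num_units=256)

# ...and split them up again afterwards.
l_out_a = SliceLayer(l_hidden, indices=slice(0, batch_size), axis=0)
l_out_b = SliceLayer(l_hidden, indices=slice(batch_size, None), axis=0)
```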

I like @f0k's idea of allowing this `DictMergeLayer` to work with both separate layers providing the different inputs, and a single layer providing a dict of inputs (or combinations of...
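There is no `DictMergeLayer` in Lasagne itself; a rough sketch of the layer being proposed, here only handling the case of several separate incoming layers keyed by name (the dict-input variant @f0k suggested is left out), could look like:

```python
from lasagne.layers import MergeLayer

class DictMergeLayer(MergeLayer):
    """Hypothetical layer that bundles the outputs of several incoming
    layers into a single dict, keyed by the given names."""
    def __init__(self, incomings, names, **kwargs):
        super(DictMergeLayer, self).__init__(incomings, **kwargs)
        assert len(names) == len(incomings)
        self.names = names

    def get_output_shape_for(self, input_shapes):
        return dict(zip(self.names, input_shapes))

    def get_output_for(self, inputs, **kwargs):
        return dict(zip(self.names, inputs))
```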

I think if we have layers that return multiple outputs, it might be best if only layers that can accept multiple inputs can be stacked on top of them. This...

I think this was the reason I originally put `*args` in the `get_output_for` signature along with `**kwargs`, although that probably wasn't a good solution anyway. The 'side output' stuff sounds...
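For reference, the two shapes the method takes in current Lasagne, shown with purely illustrative layer classes: a single-input layer gets one expression, while a `MergeLayer` subclass receives its inputs as a list rather than via `*args`:

```python
from lasagne.layers import Layer, MergeLayer

class IdentityLayer(Layer):
    # Single-input layers get exactly one input expression.
    def get_output_for(self, input, **kwargs):
        return input

class SumLayer(MergeLayer):
    def get_output_shape_for(self, input_shapes):
        return input_shapes[0]

    # Multi-input layers receive a list of expressions via MergeLayer,
    # rather than arbitrary positional arguments through *args.
    def get_output_for(self, inputs, **kwargs):
        return sum(inputs)
```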

The most appealing option to me is also to have a single 'object' that represents the output of a layer, and maybe this object could be something other than a...
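Purely as an illustration of that idea (not an actual Lasagne class), such an output object could be a thin wrapper around the main expression with room for named extras:

```python
class LayerOutput(object):
    """Hypothetical return value for a layer: one main expression plus
    optional named side outputs."""
    def __init__(self, main, **side_outputs):
        self.main = main
        self.side_outputs = side_outputs

    def __getitem__(self, key):
        return self.side_outputs[key]
```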

It does, sort of, but what if someone suddenly comes up with a layer that returns a completely different data type, say, a string? If we make things data type...

> I think I prefer the second option. As far as I see, it wouldn't require any changes to Lasagne, we could just add layers that return dictionaries and others...
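In the spirit of that second option, alongside a dict-returning layer like the `DictMergeLayer` sketch above, one could add a layer that picks a single entry back out, all without touching Lasagne itself; again just a hypothetical sketch:

```python
from lasagne.layers import Layer

class SelectOutputLayer(Layer):
    """Hypothetical layer that expects its incoming layer to produce a
    dict and passes through the entry for the given key."""
    def __init__(self, incoming, key, **kwargs):
        super(SelectOutputLayer, self).__init__(incoming, **kwargs)
        self.key = key

    def get_output_shape_for(self, input_shape):
        # Upstream the "shape" would itself be a dict of shapes.
        return input_shape[self.key]

    def get_output_for(self, input, **kwargs):
        return input[self.key]
```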