Daniel Nouri

Results: 55 comments by Daniel Nouri

Interesting point about the release of the GIL and the example code provided. I was looking to put a thread around my batch generator, but will still have to figure out...
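A minimal sketch of what "putting a thread around a batch generator" could look like: a background thread fills a bounded queue while the main thread consumes batches. The function name `threaded_generator` and the `queue_size` parameter are illustrative, not from the thread above.

```python
import threading
from queue import Queue  # on Python 2, this module is called Queue

def threaded_generator(batches, queue_size=8):
    """Run the `batches` iterator in a background thread,
    buffering up to `queue_size` items ahead of the consumer."""
    q = Queue(maxsize=queue_size)
    sentinel = object()

    def producer():
        for item in batches:
            q.put(item)
        q.put(sentinel)  # signal exhaustion

    thread = threading.Thread(target=producer)
    thread.daemon = True
    thread.start()

    item = q.get()
    while item is not sentinel:
        yield item
        item = q.get()
```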

On 10/14/2014 03:51 PM, Jan Schlüter wrote: > I like the flexibility; it allows me to do data augmentation while I generate batches, FlipBatchGenerator, CropBatchGenerator etc. ...
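`FlipBatchGenerator` and `CropBatchGenerator` are only named in the quote; a hypothetical sketch of an augmenting generator in that spirit, assuming batches of images shaped (N, C, H, W):

```python
import numpy as np

def flip_batches(batches, p=0.5):
    """Wrap a batch generator, randomly flipping roughly `p` of the
    samples in each batch along the width axis."""
    for Xb, yb in batches:
        flip = np.random.rand(Xb.shape[0]) < p
        Xb = Xb.copy()
        Xb[flip] = Xb[flip, :, :, ::-1]  # horizontal flip
        yield Xb, yb
```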

On 10/14/2014 11:36 PM, Jan Schlüter wrote: > I'm still wondering how it would compare to do this on-the-fly augmentation inside a dedicated layer. Maybe it can...
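The quote only raises the idea of a dedicated augmentation layer; a rough, untested sketch of what such a layer could look like, assuming Lasagne's custom-layer convention of subclassing `Layer` and overriding `get_output_for` with a `deterministic` flag (as `DropoutLayer` does). `RandomFlipLayer` is a made-up name.

```python
import lasagne
import theano.tensor as T
from theano.sandbox.rng_mrg import MRG_RandomStreams

class RandomFlipLayer(lasagne.layers.Layer):
    """Horizontally flips the whole incoming (N, C, H, W) batch with
    probability 0.5 at training time; a no-op when deterministic=True."""

    def __init__(self, incoming, **kwargs):
        super(RandomFlipLayer, self).__init__(incoming, **kwargs)
        self._srng = MRG_RandomStreams()

    def get_output_for(self, input, deterministic=False, **kwargs):
        if deterministic:
            return input
        flip = self._srng.binomial((1,), p=0.5)[0]
        return T.switch(flip, input[:, :, :, ::-1], input)
```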

On 10/17/2014 12:21 PM, Jan Schlüter wrote: > I'll try and stare at feed_minibatches() a little bit more, because I definitely like the idea of making it...

> Of course I know there is nolearn, but one thing I dislike about it is how it requires you to specify your network architecture in a non-standard way, for...
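For context, the "non-standard way" presumably refers to nolearn.lasagne's declarative style, which lists layers as (name, class) tuples and passes their parameters as prefixed keyword arguments, roughly like the following (parameter values are made up):

```python
from lasagne.layers import InputLayer, DenseLayer
from lasagne.nonlinearities import softmax
from lasagne.updates import nesterov_momentum
from nolearn.lasagne import NeuralNet

net = NeuralNet(
    layers=[
        ('input', InputLayer),
        ('hidden', DenseLayer),
        ('output', DenseLayer),
    ],
    # layer parameters, prefixed with the layer name:
    input_shape=(None, 784),
    hidden_num_units=100,
    output_num_units=10,
    output_nonlinearity=softmax,
    # training parameters:
    update=nesterov_momentum,
    update_learning_rate=0.01,
    update_momentum=0.9,
    max_epochs=10,
)
```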

> With regards to Keras's `compile` method, I suppose one reason to split this would be to make this step a bit more explicit, because it can take a long...
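For reference, the separate `compile` step in Keras looks roughly like this (layer sizes are arbitrary). The expensive part is building the backend computation functions, which happens in `compile` rather than in `fit`, which is why keeping it explicit makes the cost visible:

```python
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(64, input_dim=100, activation='relu'))
model.add(Dense(10, activation='softmax'))

# Explicit, potentially slow step: the backend functions are built here.
model.compile(optimizer='sgd', loss='categorical_crossentropy')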

> In some cases it might be limiting to treat labels as a special type of input. Most tasks fit the (X, y)-paradigm (i.e. data + labels), but there are...
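One way around the (X, y) restriction, sketched here as an assumption rather than anything proposed in the thread, is to yield batches as dicts of named arrays, so tasks with multiple inputs or no single "label" array fit the same interface:

```python
def multi_input_batches(image, text, target, batch_size=32):
    """Yield dicts of named arrays instead of a fixed (X, y) pair."""
    n = len(target)
    for start in range(0, n, batch_size):
        sl = slice(start, start + batch_size)
        yield {'image': image[sl], 'text': text[sl], 'target': target[sl]}
```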

> I've trained a siamese net with nolearn, and it's working pretty well. I've even been using your trick of having an input that's double the size of the labels, and...
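The "double the size" trick isn't spelled out in the excerpt; one common arrangement (an assumption, not necessarily the exact scheme discussed) is to stack both halves of each pair into a single input batch of 2N rows while keeping N labels:

```python
import numpy as np

def make_siamese_batch(X_left, X_right, y_pair):
    """Stack both halves of each pair into one input batch of 2N rows;
    rows i and i + N form pair i, matching the N entries of y_pair."""
    Xb = np.concatenate([X_left, X_right], axis=0)  # shape (2N, ...)
    return Xb, y_pair                               # labels stay length N

def split_embeddings(embeddings, n_pairs):
    """Undo the stacking after the shared network: first half vs. second half."""
    return embeddings[:n_pairs], embeddings[n_pairs:]
```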

> 'end of the world' is a bit strong indeed, but it is a bit of a wart imo. We should avoid forcing users to use things in unintuitive ways,...

> If I'm not mistaken you don't specifically need `BatchIterator`, anything which implements the Python iterator interface should do, right?

Yes, you're right.

> I think we should provide both,...
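Since any object implementing the iterator protocol will do, a plain generator works as a batch source; a minimal sketch (not nolearn's actual `BatchIterator` implementation):

```python
import numpy as np

def iterate_minibatches(X, y, batch_size=128, shuffle=True):
    """Plain Python generator yielding (Xb, yb) minibatches."""
    indices = np.arange(len(X))
    if shuffle:
        np.random.shuffle(indices)
    for start in range(0, len(X), batch_size):
        batch = indices[start:start + batch_size]
        yield X[batch], y[batch]
```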