C. Antonio Sánchez
> Two issues I see:
>
> * It creates circular references (optimizers own variables and variables may own an optimizer)
> * It feels like a layering violation (optimizers...
> Could we consider some design where this is handled at a custom layer level? Go for a similar vibe to add_loss https://keras.io/api/losses/#the-addloss-api where an optimizer is attached during build or...
@hertschuh @fchollet Potential replacement for #21196. With this approach, layering is mostly preserved (although in our case the custom updater would still actually contain an optimizer - but at least...
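For illustration only, here is a minimal sketch of what a layer-level updater in that spirit might look like; this is not the actual proposal in #21196, the class name and the `apply_custom_update` method are hypothetical, and the "updater" here still wraps a regular Keras optimizer internally, which is the caveat noted above.

```python
import keras


class ScaleLayer(keras.layers.Layer):
    """Hypothetical layer that owns the updater for one of its own variables,
    in the same spirit as the add_loss API (illustrative sketch only)."""

    def build(self, input_shape):
        self.scale = self.add_weight(
            name="scale", shape=(input_shape[-1],), initializer="ones"
        )
        # The layer-level "updater" still wraps a regular optimizer internally,
        # which is the layering caveat mentioned above.
        self._updater = keras.optimizers.SGD(learning_rate=0.1)

    def call(self, inputs):
        return inputs * self.scale

    def apply_custom_update(self, gradient):
        # Invoked from a custom training loop instead of the model-level optimizer.
        self._updater.apply_gradients([(gradient, self.scale)])
```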
> does not accept a batch argument for contrast adjustment

I'm not certain I understand what this means. It is assumed the last dimension holds the three RGB channels,...
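To illustrate the channels-last assumption, here is a small example using `tf.image.adjust_contrast` as a stand-in (it may not be the exact op under discussion): it takes no explicit batch argument, and any leading dimensions before (height, width, channels) are treated as batch dimensions.

```python
import tensorflow as tf

# adjust_contrast has no batch argument: leading dims are treated as batch,
# and the last dimension is assumed to hold the channels (e.g. RGB).
batched = tf.random.uniform((8, 64, 64, 3))   # batch of 8 RGB images
single = tf.random.uniform((64, 64, 3))       # a single RGB image

print(tf.image.adjust_contrast(batched, contrast_factor=2.0).shape)  # (8, 64, 64, 3)
print(tf.image.adjust_contrast(single, contrast_factor=2.0).shape)   # (64, 64, 3)
```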
I added more comments on the associated TF bug, but I'll mention it here as well: I'm pretty sure it is a Keras "issue" (though it may not be a bug). We get similarly...
@dzarukin fixing an Eigen issue encountered internally at Google
> Hi @cantonios, thanks for taking care of it. I have a PR opened where I completely got rid of it: #4087, and switched to a `ThreadPool` object instead for synchronous...