bhack
As a side note, there was a status update about sparse tensor support in the MLIR compiler stack: https://llvm.discourse.group/t/mlir-support-for-sparse-tensors/2020/16
P.S. Slides and a video recording are available at https://llvm.discourse.group/t/mlir-support-for-sparse-tensors/2020/17
We have a bit of documentation on the reduction parameter here: https://github.com/tensorflow/models/blob/master/official/vision/keras_cv/losses/focal_loss.py#L37
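A minimal sketch of how the standard Keras `reduction` semantics work, assuming the linked focal loss follows the usual `tf.keras.losses.Reduction` convention; `BinaryCrossentropy` is used here only as a stand-in for the focal loss in that file.

```python
import tensorflow as tf

# Stand-in loss with an explicit reduction; the focal loss in the linked file
# is assumed to accept the same kind of `reduction` argument.
loss_fn = tf.keras.losses.BinaryCrossentropy(
    reduction=tf.keras.losses.Reduction.SUM_OVER_BATCH_SIZE)

y_true = tf.constant([[1.0], [0.0]])
y_pred = tf.constant([[0.9], [0.2]])
print(loss_fn(y_true, y_pred))  # scalar: per-example losses summed, then divided by batch size
```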
I put this in the ecosystem review in the meantime because I want to check how we want to handle these duplicated, but not strictly aligned, implementations.
@ravinderkhatri Keras-cv is being refactored. We have a PR at https://github.com/tensorflow/addons/pull/2422
Other than exactly reproducing our published nightly wheels with a local user build, this is required to test the GitHub action at https://github.com/tensorflow/build/pull/48
AdamW is in Keras now. Can you check their implementation: https://github.com/keras-team/keras/blob/master/keras/optimizers/optimizer_experimental/adamw.py As it is in Keras, we will probably route PRs there.
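A minimal sketch of using that optimizer, assuming the experimental Keras namespace exposed as `tf.keras.optimizers.experimental.AdamW` (the hyperparameter values are illustrative only):

```python
import tensorflow as tf

# Experimental Keras AdamW (decoupled weight decay); values are placeholders.
optimizer = tf.keras.optimizers.experimental.AdamW(
    learning_rate=1e-3, weight_decay=1e-4)

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
model.compile(optimizer=optimizer, loss="mse")
```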
These are running in a `tf.function` in a classical model build/compile: https://github.com/tensorflow/tensorflow/issues/24874#issuecomment-460882184 Please open a ticket in Keras if you want some change. We are going to deprecate this quite...
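A small sketch of what "running in a `tf.function`" means in a classical build/compile/fit flow: a custom loss passed to `compile()` is traced into a graph, so Python side effects like `print` run at trace time rather than on every batch (the loss function name and shapes here are illustrative).

```python
import tensorflow as tf

def my_loss(y_true, y_pred):
    print("traced")  # runs only when the function is traced, not per step
    return tf.reduce_mean(tf.square(y_true - y_pred))

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="sgd", loss=my_loss)

x = tf.random.normal((8, 4))
y = tf.random.normal((8, 1))
model.fit(x, y, epochs=1, verbose=0)
```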
If you read the Note box at this link, with tf.function/AutoGraph the ops are executed in the expected (program) order: https://www.tensorflow.org/api_docs/python/tf/control_dependencies /cc @mdanatg who could probably detail the internals of that claim in the...
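A minimal sketch of that Note: inside a `tf.function`, stateful ops execute in program order without explicit `tf.control_dependencies`.

```python
import tensorflow as tf

v = tf.Variable(0)

@tf.function
def f():
    # Stateful ops below run in program order; no tf.control_dependencies needed.
    v.assign(1)
    v.assign_add(2)
    return v.read_value()

print(f())  # tf.Tensor(3, shape=(), dtype=int32)
```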
Generally we require a bit of stabilization for research papers, so we wait for >50 citations.