bhack

1416 comments by bhack

In the earlier comment I was talking about loading and saving the model. I don't think there is a helper to directly save and load a single function cache.

I meant there is code to save a concrete function, but I think it is not exposed in a single public API call. You can check https://github.com/tensorflow/tensorflow/blob/v2.3.1/tensorflow/python/saved_model/save.py I suggest you check the...
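As a point of reference, the usual workaround is to wrap the function in a trackable `tf.Module` and pass the concrete function as a signature to `tf.saved_model.save`. A minimal sketch (the function, module attribute, and export path are all illustrative, not from the linked code):

```python
import tensorflow as tf

# A tf.function with a fixed input signature, so a single
# concrete function is traced
@tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
def double(x):
    return x * 2.0

# Wrap the function in a trackable object so it can be saved
module = tf.Module()
module.double = double

# Export the concrete function as the serving signature
tf.saved_model.save(module, "/tmp/double_model",
                    signatures=double.get_concrete_function())

# Load it back and call the restored signature
loaded = tf.saved_model.load("/tmp/double_model")
result = loaded.signatures["serving_default"](tf.constant([1.0, 2.0]))
```

Note that this round-trips one concrete function through the full SavedModel machinery; there is still no lighter single-call API for just the function cache, which is the gap discussed above.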

Yes, this was just a little bit of supporting info in case you want to try to prepare a PR. @bdch1234 I don't know if you are interested, but I think...

@animesh0794 Are you working on recommender systems?

> @bhack yes I am.

We have a new project in the ecosystem: https://www.tensorflow.org/recommenders https://blog.tensorflow.org/2020/09/introducing-tensorflow-recommenders.html https://www.tensorflow.org/recommenders/examples/quickstart

Yes, but you can still take inspiration from how the datasets are managed: https://github.com/tensorflow/recommenders/blob/main/tensorflow_recommenders/examples/movielens.py

See how the dataset's `element_spec` differs between `drop_remainder=True` and `False`:

```python
ds = ds.batch(2, drop_remainder=False)
print("Element spec:", ds.element_spec)
```

And check the `lambda` in the two cases: https://github.com/tensorflow/tensorflow/blob/df86a9308a1112a8fd6cd8369a24ec84fa6cd125/tensorflow/python/data/ops/dataset_ops.py#L3906-L3917 So in...
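To make the difference concrete, here is a small self-contained comparison (the dataset contents are illustrative):

```python
import tensorflow as tf

ds = tf.data.Dataset.range(5)

# drop_remainder=False: the last batch may be smaller than 2,
# so the batch dimension is unknown (None) in the element_spec
spec_false = ds.batch(2, drop_remainder=False).element_spec

# drop_remainder=True: every emitted batch has exactly 2 elements,
# so the batch dimension is statically known
spec_true = ds.batch(2, drop_remainder=True).element_spec

print("drop_remainder=False:", spec_false)
print("drop_remainder=True: ", spec_true)
```

With `drop_remainder=False` the spec's shape is `(None,)`, while with `drop_remainder=True` it is `(2,)`, which is what makes the static length of the batched dataset computable in the second case.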

@aaudiber: In the meantime, what do you think about a small PR like this (as it would cover some eager cases)?

```python
try:
    modulo = self._input_dataset.__len__() % self._batch_size
except TypeError:
    modulo...
```
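The truncated snippet above hinges on `__len__` raising `TypeError` when the length is not statically known (as in graph mode). A minimal pure-Python sketch of that fallback idea, with hypothetical names and a stand-in `UNKNOWN` constant (mirroring `tf.data.UNKNOWN_CARDINALITY`, which is -2):

```python
UNKNOWN = -2  # stand-in for tf.data.UNKNOWN_CARDINALITY

def batched_cardinality(input_dataset, batch_size, drop_remainder):
    """Hypothetical helper: cardinality of input_dataset.batch(batch_size)."""
    try:
        n = len(input_dataset)  # raises TypeError if length is unknown
    except TypeError:
        return UNKNOWN
    if drop_remainder:
        # Partial final batch is dropped
        return n // batch_size
    # Partial final batch is kept: ceiling division
    return -(-n // batch_size)
```

This keeps the graph-mode behavior unchanged (unknown stays unknown) while answering the eager case where the input length is available, which is the tension discussed in the reply below.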

> @bhack I'd prefer to keep the behavior consistent between eager and graph mode, so that putting a @tf.function around eager code won't change the result of cardinality/`__len__`.

OK, but...