cadama

Results: 4 comments by cadama

I am facing the very same error. Here is my setup. Schema of the table I am trying to query: ``` campaignid | INTEGER | NULLABLE |   smth |...

There exists a protobuf size limit, reported here: https://stackoverflow.com/questions/34128872/google-protobuf-maximum-size/34186672. This is not a memory limit of the machine. How can one train on a dataset that exceeds this size?
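For reference, here is a minimal sketch of one common TensorFlow workaround, assuming the 2 GB ceiling is hit because the training data gets serialized into the graph as constants: stream the data through `tf.data.Dataset.from_generator` so the array never lands in the GraphDef. The shapes, sizes, batch size, and the commented-out model call are illustrative placeholders, not taken from this issue.

```
# Hedged sketch: keep a large dataset out of the serialized graph by streaming it.
# Shapes, sizes and dtypes below are illustrative placeholders.
import numpy as np
import tensorflow as tf

# Stand-in for a dataset that would blow past 2 GB if embedded as a tf.constant.
features = np.random.rand(100_000, 64).astype(np.float32)
labels = np.random.randint(0, 2, size=(100_000,)).astype(np.int32)

def row_generator():
    # Rows are yielded at runtime, so nothing is baked into the GraphDef
    # and the protobuf size limit never applies to the data itself.
    for x, y in zip(features, labels):
        yield x, y

dataset = (
    tf.data.Dataset.from_generator(
        row_generator,
        output_signature=(
            tf.TensorSpec(shape=(64,), dtype=tf.float32),
            tf.TensorSpec(shape=(), dtype=tf.int32),
        ),
    )
    .batch(1024)
    .prefetch(tf.data.AUTOTUNE)
)

# model.fit(dataset, epochs=5)  # hypothetical Keras model, out of scope here
```

Under the same assumption, writing the data to TFRecord shards and reading them back with `tf.data.TFRecordDataset` would avoid the limit as well.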

I tried the solution proposed by @mcourteaux, but I ran into a different error. Here is my code: ``` import six from tensorflow.python.keras import backend as K from tensorflow.python.keras.utils.tf_utils import...

+1 for this. I would also be happy with any hack that makes this possible.