keras
Deep Learning library for Python. Convnets, recurrent neural networks, and more. Runs on MXNet, Theano, or TensorFlow.
Here is the output:

```
Using MXNet backend.
Traceback (most recent call last):
  File "run.py", line 128, in <module>
    model = Model(config_args['alpha'], config_args['gamma'], config_args['input_size'], config_args['hidden_size'])
  File "/tmp/Model.py", line 37, in __init__
    kernel_initializer='glorot_normal'))
```
...
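The traceback above ends at a layer constructed with `kernel_initializer='glorot_normal'`. For context, Glorot (Xavier) normal initialization draws weights from a zero-mean normal distribution with standard deviation `sqrt(2 / (fan_in + fan_out))`. A minimal pure-Python sketch of that rule (for illustration only, not Keras's actual implementation):

```python
import math
import random

def glorot_normal_std(fan_in, fan_out):
    """Standard deviation used by Glorot (Xavier) normal initialization."""
    return math.sqrt(2.0 / (fan_in + fan_out))

def glorot_normal(fan_in, fan_out, seed=0):
    """Draw a fan_in x fan_out weight matrix from N(0, std^2)."""
    rng = random.Random(seed)
    std = glorot_normal_std(fan_in, fan_out)
    return [[rng.gauss(0.0, std) for _ in range(fan_out)]
            for _ in range(fan_in)]
```

In Keras this is selected by name (`kernel_initializer='glorot_normal'`); the sampling itself is delegated to whichever backend is active.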
resolves #85
Hi, I wanted to ask what the plans are to pull in Keras 2.0, as there have been interface-level changes since 1.2.2. Rahul
```
model.add(Embedding(max_features, embedding_dims,
                    input_length=maxlen, dropout=0.2))
```

This line throws this exception:

```
  train_ret = func(*args, **kwargs)
  File "env\lib\site-packages\keras\backend\mxnet_backend.py", line 3007, in random_binomial
    raise NotImplementedError
NotImplementedError
```
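The `dropout=0.2` argument on `Embedding` requires the backend to supply `random_binomial` (a Bernoulli mask for dropout), which the MXNet backend at line 3007 leaves unimplemented. As a rough illustration of what that op computes, here is a pure-Python sketch (the real backend op works on tensors, not lists):

```python
import random

def random_binomial(shape, p=0.5, seed=0):
    """Bernoulli(p) samples of a 1-D shape: 1.0 with probability p,
    else 0.0 -- the kind of mask dropout multiplies activations by."""
    rng = random.Random(seed)
    n, = shape
    return [1.0 if rng.random() <= p else 0.0 for _ in range(n)]
```

A likely workaround (untested against this backend) is to drop the `dropout` argument from `Embedding`, since that is the only code path in this snippet that reaches `random_binomial`.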
```
self.mod_ids = Lambda(lambda sent: sent % (nr_tune - 1) + 1,
                      output_shape=(self.max_length,))
```

This returns an error: `TypeError: unsupported operand type(s) for %: 'KerasSymbol' and 'int'`
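That `TypeError` is Python's generic message when the left operand's class does not implement `__mod__` (and the right operand's `__rmod__` doesn't apply): here, the MXNet backend's `KerasSymbol` wrapper apparently does not overload `%`. A minimal stand-in sketch of the mechanism (these classes are hypothetical, not the real `KerasSymbol`):

```python
class SymbolWithoutMod:
    """Stand-in for a tensor wrapper that does not overload %."""
    pass

class SymbolWithMod:
    """Stand-in wrapper that forwards % to its wrapped value,
    as a backend's symbol class would forward to a modulo op."""
    def __init__(self, value):
        self.value = value

    def __mod__(self, other):
        return SymbolWithMod(self.value % other)

# SymbolWithoutMod() % 3  -> TypeError: unsupported operand type(s) for %
# SymbolWithMod(7) % 3    -> wrapper holding 1
```

Until the backend overloads `%` (or exposes a modulo op), one option may be to compute the index arithmetic outside the symbolic graph, e.g. on the input IDs before they enter the model.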
The line:

```
LSTM(1000, activation='tanh', return_sequences=True)(ht)
```

produces a strange error for me:

```
/usr/local/lib/python3.5/dist-packages/keras/layers/recurrent.py in get_initial_states(self, x)
    202     def get_initial_states(self, x):
    203         # build an all-zero tensor of shape...
```
Hi, I am now using Keras 1.2.2 with the MXNet backend on ResNet50. I observed a very weird phenomenon: if we use **mxnet** as the backend, when we finish training and save...
This prevents the unfortunate scenario in which layer.set_weights() is called before training on some or all layers in a model, and then the model is trained and saved. When the...
I switched my Keras backend from TensorFlow to MXNet to use MXNet's multi-GPU training. However, code that runs successfully with the Keras TensorFlow backend seems not to be compatible with keras...