
"sigmoid.0 is not in list" error when I run example/mnist_gan.py

Open · AlexZhou1995 opened this issue on Feb 22, 2017 · 4 comments

Hi, thank you for developing this; gandlf looks really good. However, I get a "sigmoid.0 is not in list" error when I run example/mnist_gan.py.

The error log shows the following: [screenshot: err1]

Many similar errors follow: [screenshot: err2]

And at the end: [screenshot: err3]

AlexZhou1995 · Feb 22 '17 09:02

I got the same error in xor.py, without using a GPU.

I installed gandlf from pip and updated Keras from 1.1.2 to 1.2.2.
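Since the error appeared after moving from Keras 1.1.2 to 1.2.2, one plausible workaround (an assumption on my part, not a fix confirmed in this thread) is to pin the versions that gandlf 0.0.5 was reportedly working with, e.g. in a `requirements.txt`:

```
# Hypothetical pin-down: roll Keras back to the pre-upgrade version
# rather than the 1.2.2 release that triggers the gradient error.
keras==1.1.2
gandlf==0.0.5
```

Reinstalling with `pip install -r requirements.txt` in a fresh virtualenv also rules out stale `.pyc` files from the previous Keras version.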

AlexZhou1995 · Feb 22 '17 09:02

Same here, while trying the MNIST example. Using:

- Theano: 0.8.2
- Keras: 1.2.2
- Gandlf: 0.0.5


```
Downloading data from https://s3.amazonaws.com/img-datasets/mnist.pkl.gz

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input> in <module>()
    310 y_train_ohe = np.eye(10)[np.squeeze(y_train)]
    311
--> 312 model = train_model(args, X_train, y_train, y_train_ohe)
    313
    314 if args.plot:

<ipython-input> in train_model(args, X_train, y_train, y_train_ohe)
    243     inputs += [X_train]
    244
--> 245     model.fit(inputs, outputs, nb_epoch=args.nb_epoch, batch_size=args.nb_batch)
    246
    247     return model

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/gandlf/models.pyc in fit(self, x, y, batch_size, nb_epoch, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch)
    891                 output_names, output_shapes, False, batch_size)
    892
--> 893         self._make_train_function()
    894         train_fn = self.train_function
    895

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/gandlf/models.pyc in _make_train_function(self)
    562             self._collected_trainable_weights[0],
    563             self.constraints,
--> 564             self.generator_loss)
    565
    566         # Gets the discriminator updates.

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/keras/optimizers.pyc in get_updates(self, params, constraints, loss)
    414
    415     def get_updates(self, params, constraints, loss):
--> 416         grads = self.get_gradients(loss, params)
    417         self.updates = [K.update_add(self.iterations, 1)]
    418

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/keras/optimizers.pyc in get_gradients(self, loss, params)
     80
     81     def get_gradients(self, loss, params):
---> 82         grads = K.gradients(loss, params)
     83         if hasattr(self, 'clipnorm') and self.clipnorm > 0:
     84             norm = K.sqrt(sum([K.sum(K.square(g)) for g in grads]))

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/keras/backend/theano_backend.pyc in gradients(loss, variables)
    971
    972 def gradients(loss, variables):
--> 973     return T.grad(loss, variables)
    974
    975

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/theano/gradient.pyc in grad(cost, wrt, consider_constant, disconnected_inputs, add_names, known_grads, return_disconnected, null_gradients)
    458
    459     var_to_app_to_idx = _populate_var_to_app_to_idx(
--> 460         outputs, wrt, consider_constant)
    461
    462     # build a dict mapping var to the gradient of cost with respect to var

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/theano/gradient.pyc in _populate_var_to_app_to_idx(outputs, wrt, consider_constant)
    885     # add all variables that are true ancestors of the cost
    886     for output in outputs:
--> 887         account_for(output)
    888
    889     # determine which variables have elements of wrt as a true

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/theano/gradient.pyc in account_for(var)
    881                 if i not in idx:
    882                     idx.append(i)
--> 883                 account_for(ipt)
    884
    885     # add all variables that are true ancestors of the cost

... (the account_for frame above repeats many more times as the graph is walked recursively) ...

/Users/willem/dev/ml/keras/mlpy/lib/python2.7/site-packages/theano/gradient.pyc in account_for(var)
    858     connection_pattern = _node_to_pattern(app)
    859
--> 860     var_idx = app.outputs.index(var)
    861
    862     for i, ipt in enumerate(app.inputs):

ValueError: sigmoid.0 is not in list
```
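For reference, the final ValueError is just Python's ordinary `list.index` failure surfacing from deep inside Theano's graph walk: `account_for` asks `app.outputs.index(var)` for the position of a variable among an op's outputs, and if that exact variable object is not in the list (e.g. because the graph was rebuilt or a node was replaced somewhere upstream), the lookup fails. A minimal, gandlf-free sketch of the same failure (the names here are stand-ins, not Theano objects):

```python
# Stand-ins for an op's symbolic outputs; the variable being looked up
# ("sigmoid.0") is not among them, mirroring app.outputs.index(var).
outputs = ["sigmoid_a", "sigmoid_b"]

try:
    outputs.index("sigmoid.0")
except ValueError as err:
    # Plain list.index quotes the string; Theano formats its variable
    # name without quotes, hence "sigmoid.0 is not in list" in the log.
    print(err)
```

So the bug is not in the gradient math itself but in a graph-identity mismatch between what gandlf builds and what Keras 1.2.2 hands to Theano.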

smartsystems4u · Mar 23 '17 20:03

This seems like a Keras issue. After the new updates Gandlf is totally broken, though... It's disheartening

codekansas · Mar 30 '17 17:03

Reading the Keras blog, it sounds like the 2.x API change is one for the ages :)

smartsystems4u · Mar 30 '17 18:03