seya
Bringing up some extra Cosmo to Keras.
Hi, I tried the following install instructions, but they could not find seya. At the command line:

```
$ easy_install seya
```

Or, if you have virtualenvwrapper installed:

```
$ mkvirtualenv seya
$...
```
I would like to have a Spatial Transformer layer before a pretrained convnet, such as the Keras `ResNet50`. As such, I have prepared the following attempts to connect...
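For context, here is a minimal sketch of one such attempt in the Keras 1.x functional API. It assumes a Keras version that ships `keras.applications.resnet50`, and the `SpatialTransformer(localization_net=..., downsample_factor=...)` signature is read off seya's example notebook, so treat the exact arguments as assumptions:

```python
import numpy as np
from keras.applications.resnet50 import ResNet50
from keras.layers import Input
from keras.layers.core import Dense, Flatten
from keras.models import Model, Sequential
from seya.layers.attention import SpatialTransformer

# Localisation net regressing the 6 affine parameters, initialized
# to the identity transform (zero weights, identity bias).
b = np.array([1, 0, 0, 0, 1, 0], dtype='float32')
W = np.zeros((3 * 224 * 224, 6), dtype='float32')
locnet = Sequential()
locnet.add(Flatten(input_shape=(3, 224, 224)))
locnet.add(Dense(6, weights=[W, b]))

inputs = Input(shape=(3, 224, 224))
warped = SpatialTransformer(localization_net=locnet, downsample_factor=1)(inputs)
preds = ResNet50(weights='imagenet')(warped)
model = Model(input=inputs, output=preds)
```

This is exactly the kind of wiring the issue is about, so the sketch may fail for the same reasons the other attempts do.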
It seems that the Keras `build` function of layers has changed its prototype and now takes `input_shape` as a parameter, so [this](https://github.com/EderSantana/seya/blob/master/seya/layers/attention.py#L43) fails (see the Keras [documentation](https://keras.io/layers/writing-your-own-keras-layers/)). Regardless, once...
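For reference, a minimal custom-layer skeleton following the Keras 1.x docs linked above, where `build` receives `input_shape` (the `MyLayer` name and shapes are illustrative):

```python
from keras import backend as K
from keras import initializations
from keras.engine.topology import Layer

class MyLayer(Layer):
    def __init__(self, output_dim, **kwargs):
        self.output_dim = output_dim
        super(MyLayer, self).__init__(**kwargs)

    def build(self, input_shape):
        # New prototype: input_shape arrives as an argument; older
        # layers defined build(self) and read self.input_shape instead.
        init = initializations.get('glorot_uniform')
        self.W = init((input_shape[1], self.output_dim))
        self.trainable_weights = [self.W]

    def call(self, x, mask=None):
        return K.dot(x, self.W)

    def get_output_shape_for(self, input_shape):
        return (input_shape[0], self.output_dim)
```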
The easiest code example is below, but the IPython notebook does not work with Keras 1.0.3 either:

```python
import numpy as np
from keras.layers import Input
from keras.models import Sequential
...
```
I tried seya with the latest Keras (1.0.1); unfortunately, seya fails with the following output:

```
/usr/bin/python2.7 /home/sun/seya/examples/imdb_brnn.py
Using TensorFlow backend.
Loading data...
20000 train sequences
5000 test sequences
Pad sequences (samples...
```
See `In [5]` of https://github.com/EderSantana/seya/blob/master/examples/Spatial%20Transformer%20Networks.ipynb where the localisation network is defined. Is there a reason why the `Convolution2D` layers have no activations? And the final layer (responsible for regressing the...
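For comparison, the usual STN localisation-net pattern adds nonlinearities on the hidden layers but keeps the final regression layer linear, initialized to the identity transform so training starts from "no warp". A sketch along the lines of the notebook's cluttered-MNIST setup (layer sizes are illustrative):

```python
import numpy as np
from keras.models import Sequential
from keras.layers.convolutional import Convolution2D, MaxPooling2D
from keras.layers.core import Dense, Flatten

# Zero weights + identity-transform bias: the layer initially
# outputs [1, 0, 0, 0, 1, 0], i.e. the identity affine matrix.
b = np.array([1, 0, 0, 0, 1, 0], dtype='float32')
W = np.zeros((50, 6), dtype='float32')

locnet = Sequential()
locnet.add(MaxPooling2D(pool_size=(2, 2), input_shape=(1, 60, 60)))
locnet.add(Convolution2D(20, 5, 5, activation='relu'))
locnet.add(MaxPooling2D(pool_size=(2, 2)))
locnet.add(Convolution2D(20, 5, 5, activation='relu'))
locnet.add(Flatten())
locnet.add(Dense(50, activation='relu'))
locnet.add(Dense(6, weights=[W, b]))  # linear: regresses the 6 parameters
```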
I tried using ConvGRU but I'm getting a runtime error:

```python
import numpy as np
from keras.models import Sequential
from keras.layers.convolutional import Convolution2D
from seya.layers.conv_rnn import ConvGRU
from keras.layers.core import ...
```
Hi, I want to update your WinnerTakeAll layer to work with the new Keras, so I implemented the spatial + lifetime algorithm based on this article, with the help of...
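For concreteness, a numpy sketch of the two sparsity rules from the winner-take-all autoencoder paper, as I understand them (the function names are mine, not seya's):

```python
import numpy as np

def spatial_wta(acts):
    """Keep only the largest activation in each feature map."""
    flat = acts.reshape(acts.shape[0], acts.shape[1], -1)  # (batch, ch, h*w)
    mask = flat == flat.max(axis=2, keepdims=True)
    return (flat * mask).reshape(acts.shape)

def lifetime_wta(acts, k):
    """Per filter, keep the k samples in the mini-batch with the
    largest winning activation; zero that filter elsewhere."""
    flat = acts.reshape(acts.shape[0], acts.shape[1], -1)
    winners = flat.max(axis=2)                     # (batch, ch)
    thresh = np.sort(winners, axis=0)[-k]          # k-th largest per filter
    mask = (winners >= thresh)[:, :, None]
    return (flat * mask).reshape(acts.shape)
```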
Hi, I hope I'm not bothering you. Recently I implemented a simple autoencoder in Keras for text classification, to do domain adaptation, but it performs worse than the original...
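In case it helps the discussion, a minimal version of the kind of autoencoder described, in the Keras 1.x Sequential API (dimensions, activations, and loss are illustrative):

```python
from keras.models import Sequential
from keras.layers.core import Dense

# Toy reconstruction autoencoder: 5000-d bag-of-words -> 128-d code.
autoencoder = Sequential()
autoencoder.add(Dense(128, activation='relu', input_dim=5000))  # encoder
autoencoder.add(Dense(5000, activation='sigmoid'))              # decoder
autoencoder.compile(optimizer='adadelta', loss='binary_crossentropy')
# autoencoder.fit(X_source, X_source, nb_epoch=10, batch_size=64)
```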
Apologies if I'm missing something obvious, but when you get the output for the bidirectional RNN:

```python
Xf = self.forward.get_output(train)
Xb = self.backward.get_output(train)
Xb = Xb[::-1]
```

It seems you're reversing...
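A small numpy demonstration of why the axis matters here, assuming the usual (samples, timesteps, features) layout: `Xb[::-1]` flips the sample axis, not time.

```python
import numpy as np

X = np.arange(6).reshape(2, 3, 1)  # 2 samples, 3 timesteps, 1 feature

flipped_samples = X[::-1]          # reverses the batch ordering
flipped_time = X[:, ::-1, :]       # reverses each sequence in time

print(np.array_equal(flipped_samples, flipped_time))  # False
```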