DeepTrade_keras

relu_limited and risk_estimation definition

Open · mg64ve opened this issue 5 years ago · 1 comment

Hello, nice code, congrats! I have a question regarding the two functions above. Can relu_limited and risk_estimation be defined inside the Python code instead of in $PYTHON_DIR/dist-packages/keras/losses.py and $PYTHON_DIR/dist-packages/keras/activations.py? I want to run this in a Docker container, and it would be simpler if I could define them, for instance, in gossip.py. Please let me know what you think.

mg64ve · Jan 10 '19 08:01

Hi @mg64ve, you can add a few lines of code to gossip.py so that you don't have to change the Keras source code:

from keras import backend as K
from keras.layers import Activation
from keras.utils.generic_utils import get_custom_objects


# Wrap the custom activation in an Activation subclass so it can be
# registered and then referenced by name in the model definition.
class ReluLimited(Activation):
    def __init__(self, activation, **kwargs):
        super(ReluLimited, self).__init__(activation, **kwargs)
        self.__name__ = 'ReluLimited'

# ReLU clipped to [0, max_value], used as the output activation.
def relu_limited(x, alpha=0., max_value=1.):
    return K.relu(x, alpha=alpha, max_value=max_value)

get_custom_objects().update({'relu_limited': ReluLimited(relu_limited)})


# Custom loss: minimizing it maximizes the mean of (y_true - 0.0002) * y_pred.
def risk_estimation(y_true, y_pred):
    return -100. * K.mean((y_true - 0.0002) * y_pred)
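If you later reload a saved model (for example with keras.models.load_model) rather than only compiling a fresh one, Keras also needs to resolve the custom names at load time. A minimal sketch, assuming a hypothetical saved file model.h5:

from keras.models import load_model

# 'model.h5' is a placeholder path for wherever the trained model was saved.
model = load_model('model.h5',
                   custom_objects={'ReluLimited': ReluLimited,
                                   'relu_limited': relu_limited,
                                   'risk_estimation': risk_estimation})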

Then change the WindPuller initialization function so that it uses the custom loss and activation:

    def __init__(self, input_shape, lr=0.01, n_layers=2, n_hidden=8, rate_dropout=0.2, loss=risk_estimation):
        print("initializing..., learning rate %s, n_layers %s, n_hidden %s, dropout rate %s." % (lr, n_layers, n_hidden, rate_dropout))
        self.model = Sequential()
        self.model.add(Dropout(rate=rate_dropout, input_shape=(input_shape[0], input_shape[1])))
        for i in range(0, n_layers - 1):
            self.model.add(LSTM(n_hidden * 4, return_sequences=True, activation='tanh',
                                recurrent_activation='hard_sigmoid', kernel_initializer='glorot_uniform',
                                recurrent_initializer='orthogonal', bias_initializer='zeros',
                                dropout=rate_dropout, recurrent_dropout=rate_dropout))
        self.model.add(LSTM(n_hidden, return_sequences=False, activation='tanh',
                                recurrent_activation='hard_sigmoid', kernel_initializer='glorot_uniform',
                                recurrent_initializer='orthogonal', bias_initializer='zeros',
                                dropout=rate_dropout, recurrent_dropout=rate_dropout))
        self.model.add(Dense(1, kernel_initializer=initializers.glorot_uniform()))
        # self.model.add(BatchNormalization(axis=-1, moving_mean_initializer=Constant(value=0.5),
        #               moving_variance_initializer=Constant(value=0.25)))
        self.model.add(BatchRenormalization(axis=-1, beta_init=Constant(value=0.5)))
        self.model.add(Activation('relu_limited'))
        opt = RMSprop(lr=lr)
        self.model.compile(loss=loss,
                      optimizer=opt,
                      metrics=['accuracy'])
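With those changes everything lives in gossip.py and nothing in the Keras installation has to be edited. A minimal usage sketch (the input_shape values below are placeholders, not the repository's actual window and feature sizes):

wp = WindPuller(input_shape=(30, 61), lr=0.01, n_layers=2,
                n_hidden=8, rate_dropout=0.2, loss=risk_estimation)
wp.model.summary()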

This option works in Google Colab.

mr8bit · Jun 12 '19 23:06