translob

Hyperparameter tuning

Open · PotatoAim101 opened this issue 3 years ago · 0 comments

Hello! Thank you for the interesting results in your paper. I have a small question. In your paper you say this:

Due to the limited nature of the FI-2010 dataset, significant time was spent tuning
hyperparameters of our model to negate overfitting. In particular, our architecture was
notably sensitive to the initialization. However, due to the very strong performance of
the model, together with the flexibility and sensible inductive biases of the architecture,
we expect robust results on larger LOB datasets.

How did you choose the initial weights? Did you use any specific method to tune the hyperparameters? Could you tell me more about this?
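For context, this is the kind of thing I mean by "choosing the initial weights": fixing a seed and specifying an explicit initializer for the layers. The snippet below is just a minimal Keras-style sketch to illustrate the question; the initializer and seed are placeholders, not what I assume you actually used.

```python
import tensorflow as tf

# Placeholder seed so runs are repeatable; not assumed to be the authors' setting.
tf.random.set_seed(42)

# Example of pinning down the initialization of a single dense layer explicitly.
layer = tf.keras.layers.Dense(
    64,
    activation="relu",
    kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42),
    bias_initializer="zeros",
)
```

In other words, did you hand-pick something like the scheme above (and a seed), or did you search over initializations and other hyperparameters in a more systematic way?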

PotatoAim101 · Jul 15 '21 11:07