Multi-Cell_LSTM
Q&A: About the batch size
If I make the batch_size larger, will the model perform worse? If it is convenient, could you tell me how the parameters change when, for example, the batch_size is increased to 32?
The batch size can affect performance to some extent, depending on the dataset itself.
We use a default batch size of 10 in all experiments, with no changes. The number of model parameters is unrelated to the batch size.
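The point that the parameter count does not depend on the batch size can be sketched with a small calculation. This is a generic single-layer LSTM parameter formula, not the paper's exact architecture, and the dimensions below (100-dim input, 200-dim hidden) are illustrative assumptions:

```python
# Hedged sketch: an LSTM's parameter count depends only on the input and
# hidden dimensions, never on the batch size. The dimensions used below
# are made up for illustration, not taken from the paper.
def lstm_param_count(input_dim, hidden_dim):
    # Each of the 4 gates (input, forget, cell, output) has:
    #   an input-to-hidden weight matrix   (hidden_dim x input_dim)
    #   a hidden-to-hidden weight matrix   (hidden_dim x hidden_dim)
    #   a bias vector                      (hidden_dim)
    return 4 * (hidden_dim * input_dim + hidden_dim * hidden_dim + hidden_dim)

# The count is identical for any batch size; the batch size only changes
# how many sequences are processed per gradient update.
for batch_size in (10, 32):
    print(batch_size, lstm_param_count(100, 200))
```

Increasing the batch size therefore changes training dynamics (gradient noise, updates per epoch, speed), but never the size of the model itself.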
Hello, the parameter settings I mentioned here are hyperparameter values, not model parameters. Have you ever tried enlarging the batch_size? Running 100 epochs with a batch size of 10 really takes a lot of time.
Hi, I also recently tried to reproduce the code. Did you try setting the epoch number to 10? When I use 100 epochs, I get a better result on Broad Twitter (79.04) than the one reported in the paper (78.43). So I think the epoch setting used in the paper may be 10, or at least less than 100.
Yes, you are right! The default batch size in this paper is 10.