batch size
Hello, I used your code with a batch size of 8 during the training process, but the test results are slightly lower than those reported in your paper. May I ask if you trained with a batch size of 8 or 64?
The batch size is 64. The 8 is the number of classes (identities) within a batch.
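In other words, a minimal sketch of how that batch composition works out, assuming a typical identities-per-batch split (the exact sampler used in DN-ReID may differ):

```python
# Sketch of the batch composition described above (an assumption about how the
# 64-image batch is split across identities; the repo's sampler may differ).
total_batch_size = 64          # images per mini-batch
classes_per_batch = 8          # identities (classes) drawn for each batch
instances_per_class = total_batch_size // classes_per_batch  # 8 images per identity
print(total_batch_size, classes_per_batch, instances_per_class)  # 64 8 8
```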
The relevant argument in the code is:

```python
parser.add_argument('--batch-size', default=8, type=int, metavar='B', help='training batch size')
```

Is switching the batch size to 64 sufficient, or do I need to make other modifications as well?
I think you can use the code directly after you change the data path.
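For reference, a minimal sketch of the batch-size change, assuming only the existing `--batch-size` argument needs to be edited (or overridden on the command line); the surrounding setup here is illustrative, not the repo's actual training script:

```python
# Minimal sketch of the change, assuming the repo's argparse setup; only the
# --batch-size default is altered (from 8 to 64). Everything beyond the
# --batch-size line itself is an illustrative stand-in, not taken from the repo.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--batch-size', default=64, type=int, metavar='B',
                    help='training batch size')
args = parser.parse_args()
print(args.batch_size)  # 64 unless overridden, e.g. passing `--batch-size 128`
```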