nnUNet

Decrease batch size and epochs

AsifAhamed3720 opened this issue 9 months ago · 1 comment

Hi, is there any way to decrease the batch size for training? If so, where can I change it? There are a lot of files that define a batch size.

— AsifAhamed3720, Mar 19 '25

Yes, you can decrease the batch size in nnUNetPlans.json once preprocessing has finished and the file has been created. The relevant section looks like this:



    "configurations": {
        "2d": {
            "data_identifier": "nnUNetPlans_2d",
            "preprocessor_name": "DefaultPreprocessor",
            "batch_size": 8,
            "patch_size": [
                1280,
                1280
            ],
            "median_image_size_in_voxels": [
                2822.0,
                2423.0
            ],
            "spacing": [
                1.0,
                1.0
            ],


The number of training epochs can be reduced in nnUNetTrainer.py, located at nnunetv2/training/nnUNetTrainer/nnUNetTrainer.py, line 151 (self.num_epochs = 1000):



        ### Some hyperparameters for you to fiddle with
        self.initial_lr = 1e-2
        self.weight_decay = 3e-5
        self.oversample_foreground_percent = 0.33
        self.probabilistic_oversampling = False
        self.num_iterations_per_epoch = 250
        self.num_val_iterations_per_epoch = 50
        self.num_epochs = 1000
        self.current_epoch = 0
        self.enable_deep_supervision = True
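Instead of editing the installed source in place, you can also subclass the trainer and override num_epochs, then select your trainer when launching training (nnUNetv2_train accepts a -tr flag for this). The sketch below uses a stand-in stub base class so it is self-contained; in practice you would inherit from nnunetv2.training.nnUNetTrainer.nnUNetTrainer.nnUNetTrainer and keep its constructor signature.

```python
# Stand-in for nnUNetTrainer, just to illustrate the override pattern.
# The real base class sets self.num_epochs = 1000 in its __init__ (line 151).
class nnUNetTrainerStub:
    def __init__(self):
        self.num_epochs = 1000

# Hypothetical custom trainer: override the default after calling super().
class nnUNetTrainerShort(nnUNetTrainerStub):
    def __init__(self):
        super().__init__()
        self.num_epochs = 100  # train for 100 epochs instead of 1000
```

Keeping the change in a subclass means your edit survives package upgrades and the default trainer stays untouched.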

— NishantDahal, Aug 08 '25