Mask_RCNN

WARNING:root:You are using the default load_mask(), maybe you need to define your own one.

Rajnigoyal88 opened this issue 1 year ago • 1 comment

Hello friends, I am currently training Mask R-CNN on a custom crop/weed dataset. However, I keep getting the same warning repeated over and over when running the model on TensorFlow 2.8 with Keras 2.8. Surprisingly, even when I run the model on existing datasets available on GitHub, I still get the same warning. I have been struggling with this issue for the past few weeks, and I would greatly appreciate any help in resolving it.

Checkpoint Path: /content/drive/MyDrive/mask_aarohi/Mask-R-CNN-using-Tensorflow2/logs/object20230706T0833/mask_rcnn_object_{epoch:04d}.h5
Selecting layers to train
fpn_c5p5 (Conv2D)
fpn_c4p4 (Conv2D)
fpn_c3p3 (Conv2D)
fpn_c2p2 (Conv2D)
fpn_p5 (Conv2D)
fpn_p2 (Conv2D)
fpn_p3 (Conv2D)
fpn_p4 (Conv2D)
rpn_model (Functional)
mrcnn_mask_conv1 (TimeDistributed)
mrcnn_mask_bn1 (TimeDistributed)
mrcnn_mask_conv2 (TimeDistributed)
mrcnn_mask_bn2 (TimeDistributed)
mrcnn_class_conv1 (TimeDistributed)
mrcnn_class_bn1 (TimeDistributed)
mrcnn_mask_conv3 (TimeDistributed)
mrcnn_mask_bn3 (TimeDistributed)
mrcnn_class_conv2 (TimeDistributed)
mrcnn_class_bn2 (TimeDistributed)
mrcnn_mask_conv4 (TimeDistributed)
mrcnn_mask_bn4 (TimeDistributed)
mrcnn_bbox_fc (TimeDistributed)
mrcnn_mask_deconv (TimeDistributed)
mrcnn_class_logits (TimeDistributed)
mrcnn_mask (TimeDistributed)
/usr/local/lib/python3.10/dist-packages/keras/optimizer_v2/gradient_descent.py:102: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
  super(SGD, self).__init__(name, **kwargs)
Epoch 1/300
WARNING:root:You are using the default load_mask(), maybe you need to define your own one.
(the warning line above repeats many more times, once per image loaded)
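For context on where this warning comes from: the base `utils.Dataset.load_mask()` in the Mask_RCNN code logs it and returns an empty mask, and every custom dataset class is expected to override that method. Below is a minimal sketch of the expected override pattern. The `Dataset` stub here is a hypothetical stand-in for `mrcnn.utils.Dataset` (so the sketch is self-contained), the `CropWeedDataset` name and the polygon annotation layout are assumptions modeled on the repo's `balloon.py` sample, and the bounding-box fill stands in for proper polygon rasterization (e.g. `skimage.draw.polygon`):

```python
import numpy as np

# Hypothetical stand-in for mrcnn.utils.Dataset. The real base class's
# load_mask() is the one that logs "You are using the default load_mask()".
class Dataset:
    def __init__(self):
        self.image_info = []

    def add_image(self, source, image_id, path, **kwargs):
        info = {"id": image_id, "source": source, "path": path}
        info.update(kwargs)
        self.image_info.append(info)

    def load_mask(self, image_id):
        # Default implementation: empty mask. Reaching this triggers the warning.
        return np.empty([0, 0, 0]), np.empty([0], np.int32)


class CropWeedDataset(Dataset):
    """Custom dataset that overrides load_mask(), so the default is never hit."""

    def load_mask(self, image_id):
        info = self.image_info[image_id]
        # Assumes each image was registered with polygon annotations and
        # per-instance class ids (as in the balloon.py sample).
        polygons = info["polygons"]
        mask = np.zeros([info["height"], info["width"], len(polygons)],
                        dtype=np.uint8)
        for i, p in enumerate(polygons):
            # Simple bounding-box fill as a placeholder for real polygon
            # rasterization (skimage.draw.polygon in the actual samples).
            rows = slice(min(p["all_points_y"]), max(p["all_points_y"]) + 1)
            cols = slice(min(p["all_points_x"]), max(p["all_points_x"]) + 1)
            mask[rows, cols, i] = 1
        class_ids = np.array(info["class_ids"], dtype=np.int32)
        return mask.astype(bool), class_ids
```

The key point is that if training is hitting the default `load_mask()`, the training code is not using your subclass (or the override's name/signature doesn't match), so Mask R-CNN trains on empty masks.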

Rajnigoyal88 avatar Jul 06 '23 09:07 Rajnigoyal88

This doesn't solve the load_mask() warning you are getting, but it does fix the other one: /usr/local/lib/python3.10/dist-packages/keras/optimizer_v2/gradient_descent.py:102: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.

Search your code for the following optimizer call and change `lr` to `learning_rate`:

# Optimizer object (in mrcnn/model.py, inside MaskRCNN.compile)
        optimizer = keras.optimizers.SGD(
            learning_rate=learning_rate, momentum=momentum,
            clipnorm=self.config.GRADIENT_CLIP_NORM)

TimHunt12 avatar Aug 14 '23 22:08 TimHunt12