DL-Simplified
Adverse Weather Synthetic Segmentation
Deep Learning Simplified Repository (Proposing new issue)
:red_circle: Project Title : Adverse Weather Synthetic Segmentation
:red_circle: Aim : Create a DL model for segmenting synthetic adverse weather conditions.
:red_circle: Dataset : https://www.kaggle.com/datasets/abdulrahmankerim/semantic-segmentation-under-adverse-conditions
:red_circle: Approach : Try to use 3-4 algorithms to implement the models and compare all the algorithms to find the best fitted algorithm for the model by checking the accuracy scores. Also, do not forget to do an exploratory data analysis before creating any model.
📍 Follow the Guidelines to Contribute in the Project :
- You need to create a separate folder named after the Project Title.
- Inside that folder, there will be four main components.
- Images - To store the required images.
- Dataset - To store the dataset or, information/source about the dataset.
- Model - To store the machine learning model you've created using the dataset.
- requirements.txt - This file will contain the required packages/libraries to run the project on other machines.
- Inside the Model folder, the README.md file must be filled in properly, with proper visualizations and conclusions.
:red_circle::yellow_circle: Points to Note :
- The issues will be assigned on a first come first serve basis, 1 Issue == 1 PR.
- "Issue Title" and "PR Title should be the same. Include issue number along with it.
- Follow Contributing Guidelines & Code of Conduct before start Contributing.
:white_check_mark: To be Mentioned while taking the issue :
- Full name :
- GitHub Profile Link :
- Email ID :
- Participant ID (if applicable):
- Approach for this Project :
- What is your participant role? (Mention the Open Source program)
Happy Contributing 🚀
All the best. Enjoy your open source journey ahead. 😎
- Full Name: Rohan Kurien Thomas
- GitHub Profile Link: https://github.com/rohankthomas801
- Email ID: [email protected]
- Participant ID: SSOC Season 2
- Approach for this Project:
- Data Collection - Download the given dataset.
- Data Preprocessing - I will first confirm that the images are of a consistent size, then normalize pixel values and split the images into training, validation, and test sets. I will also check that the distribution of different weather conditions is balanced across the sets to avoid bias.
- Data Augmentation - I can also apply data augmentation techniques to increase the diversity of the training dataset, using random transformations such as rotating, scaling, and flipping the images.
- Model Selection - This project requires semantic segmentation, so models commonly used for it, such as U-Net, DeepLabV3+, and FCN, will be implemented and compared.
- Model Training - Loss functions commonly used for semantic segmentation include Binary Cross-Entropy loss and Dice loss; either (or a combination of both) can be used to train the models.
- Model Evaluation - Several metrics will be considered, for example Intersection over Union (IoU) and the Dice score.
- Participant Role: Social Summer of Code 2023- Contributor
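The evaluation step above can be sketched with a minimal NumPy example. This is just an illustration on hypothetical binary masks; in the actual project, `pred` would come from the trained segmentation model and `target` from the dataset's ground-truth masks:

```python
import numpy as np

def iou_score(pred, target, eps=1e-7):
    """Intersection over Union for binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)

def dice_score(pred, target, eps=1e-7):
    """Dice coefficient: 2*|A & B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Toy 4x4 masks standing in for a model prediction vs. ground truth
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
target = np.array([[1, 1, 1, 0],
                   [1, 1, 1, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])

print(round(iou_score(pred, target), 3))   # intersection 4, union 6 -> 0.667
print(round(dice_score(pred, target), 3))  # 2*4 / (4 + 6) -> 0.8
```

For multi-class segmentation, the same functions can be applied per class and the results averaged (mean IoU); the Dice score can also be turned into a differentiable loss (1 - Dice) for training.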
Issue assigned to you @rohankthomas801
Hey, can I get this issue? I have experience in segmentation, so I will be using UNet, UNet++ (depends on the dataset and my machine), DeepLab, SegNet, and ENet.
Assigned @CoderOMaster
This dataset is made specifically for a research paper. Will it work if I implement that?
Implement the models and let me know about that, I'll check that for you.
@abhisheks008 I just checked and found they have used another dataset which is quite huge, and I don't think it will be possible to run it on our machines. In that case, what should I do?
Can I use the Cityscapes dataset for segmentation? I did not find any suitable dataset for weather.
https://www.kaggle.com/datasets/vijaygiitk/multiclass-weather-dataset or I can use this weather dataset to perform weather classification.
Go ahead.
Hello @CoderOMaster! Your issue #220 has been closed. Thank you for your contribution!