DL-Simplified
[Project Addition] : Sarcasm Detection using NLP
Deep Learning Simplified Repository (Proposing new issue)
:red_circle: Project Title: Sarcasm Detection
:red_circle: Aim: To build and compare various deep learning models for detecting sarcasm in text data using TensorFlow and Keras. Each model explores a different architecture or technique to improve sarcasm detection performance.
:red_circle: Dataset: The models are designed to work with any text classification dataset.
📍 Follow the Guidelines to Contribute to the Project:
- You need to create a separate folder named the Project Title.
- Inside that folder, there will be four main components.
- Images - To store the required images.
- Dataset - To store the dataset or, information/source about the dataset.
- Model - To store the machine learning model you've created using the dataset.
- requirements.txt - This file will contain the required packages/libraries to run the project on other machines.
- Inside the `Model` folder, the `README.md` file must be filled up properly, with proper visualizations and conclusions.
:red_circle::yellow_circle: Points to Note :
- The issues will be assigned on a first come, first served basis, 1 Issue == 1 PR.
- "Issue Title" and "PR Title" should be the same. Include the issue number along with it.
- Follow the Contributing Guidelines & Code of Conduct before you start contributing.
:white_check_mark: To be Mentioned while taking the issue :
- Full name: Bristi Halder
- GitHub Profile Link: https://github.com/bristiHalder
- What is your participant role? GSSoC'24 Contributor
Happy Contributing 🚀
All the best. Enjoy your open-source journey ahead. 😎
Thank you for creating this issue! We'll look into it as soon as possible. Your contributions are highly appreciated! 😊
Hi @bristiHalder, can you please share a brief description of the models you want to implement here?
@abhisheks008 Sure!
- Global Average Pooling: an Embedding layer followed by a GlobalAveragePooling1D layer and Dense layers.
- Stacked Bidirectional LSTM.
- A single Bidirectional LSTM layer with Dense layers.
- Convolutional layers followed by a Bidirectional LSTM.
- CNN-LSTM Hybrid with Batch Normalization.
- GRU layers with Dropout for regularization.
- An attention layer after a Bidirectional LSTM.
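As a rough illustration of the first item in the list, the Global Average Pooling baseline could be sketched in Keras as below. The hyperparameters (`vocab_size`, `embedding_dim`, `max_len`, the Dense widths) are illustrative placeholders, not values from this issue, and the actual implementation in the PR may differ.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed preprocessing parameters -- placeholders, not from the issue.
vocab_size = 10000    # size of the tokenizer vocabulary
embedding_dim = 16    # width of the learned word embeddings
max_len = 100         # padded sequence length

# Embedding -> GlobalAveragePooling1D -> Dense, as described above.
model = models.Sequential([
    layers.Input(shape=(max_len,)),
    layers.Embedding(vocab_size, embedding_dim),
    layers.GlobalAveragePooling1D(),      # average word embeddings into one vector
    layers.Dense(24, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # binary output: sarcastic vs. not
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

Trained with padded token sequences and binary labels, this serves as a simple baseline against which the LSTM, CNN, GRU, and attention variants can be compared.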
Assigned this issue to you @bristiHalder