DeepLearning-Study
DeepLearning Basic Study
This is the repository for the Deep Learning Study at Kyung Hee University's Computer Engineering Club D.COM.
This study is recommended for those who want to review Machine Learning concepts and for those who have just learned Python.
- The course material is designed to be accessible to people who are just starting out with Python.
- Tae Hwan Jung (@graykode) will lead this study using PyTorch as the deep learning framework, but examples will also be implemented in TensorFlow and Keras for beginners.
- We deal with basic mathematical theory and basic Deep Learning models such as `DNN`, `CNN`, `RNN`, and `LSTM` in the first study. All of the code is implemented in fewer than 30 lines.
- We will use the `Google Colaboratory` GPU for compute resources, so you can run everything easily via the Colab links. (Thanks, Google!)
- The lecture material is linked in Korean; only the Contents are written in English.
Contribution Guide
If you find an English link, or a helpful link in any language, please contribute it to the README in Markdown like this:
Linear Regression([Eng](your contribution link), Kor)
Curriculum
Please see the Contents below.
- Week 1
  - Basic Probability Review
  - Supervised Learning vs. Unsupervised Learning
  - Linear Regression, Logistic Regression
  - `manual` Gradient Descent implementation using pure Python (see the sketch after this curriculum)
- Week 2
  - How to use Google Colaboratory
  - Linear Regression, Logistic Regression review; convert the `manual` implementation to an `auto` implementation using PyTorch
- Week 3
  - Classification with a DNN (Deep Neural Network) in PyTorch
  - Apply the Regularization (Dropout) concept to the DNN
  - Optimization functions in PyTorch: mini-batch, SGD, Adagrad, RMSProp, AdaDelta, and Adam optimizers
- Week 4
  - Basic Convolutional Neural Network
  - Load datasets and use the data loader with torchvision
  - Apply the Machine Learning Diagnostic concept (Train Set, Cross Validation Set, Test Set) to the DNN
  - Implement MNIST classification using a CNN
- Week 5
  - Basic RNN (Recurrent Neural Network) and LSTM in PyTorch
  - Teacher Forcing vs. No Teacher Forcing
  - Practice: predict the next word using an RNN or LSTM
- Week 6: Hackathon
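As a taste of the Week 1 material, here is a minimal sketch of manual gradient descent for simple linear regression in pure Python. The toy data, learning rate, and iteration count are illustrative assumptions, not fixed course values.

```python
# Manual gradient descent for y = w * x + b in pure Python.
# Toy data and hyperparameters below are illustrative assumptions.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # generated from y = 2x + 1

w, b = 0.0, 0.0
lr = 0.01  # learning rate

for step in range(1000):
    n = len(xs)
    # Gradients of the mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
    w -= lr * grad_w
    b -= lr * grad_b

print(w, b)  # should approach w = 2, b = 1
```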
Contents
0. Review of Basic Mathematical Theory with pure Python
- Supervised Learning vs. Unsupervised Learning: in this study, we deal only with supervised models.
- Basic Probability Review
- Linear Regression (Eng, Kor)
- Logistic Regression
  - How is it different from Linear Regression? (Eng, Kor)
  - Loss function and activation function in Logistic Regression
  - Gradient Descent in Logistic Regression
  - Difference between binary classification and multi-class classification (Sigmoid vs. Softmax) (Eng, Kor1, Kor2) (see the sketch after this section)
  - Difference between multi-class classification and multi-label classification (Eng, Kor)
- Optimizing
  - What are batch and mini-batch? (Eng, Kor)
  - Role of Momentum (Eng, Kor)
  - SGD, Adagrad, RMSProp, AdaDelta, Adam optimizers (Eng, Kor): 2.DNN-Optimization.py
- Regularization
- Machine Learning Diagnostic
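To make the Sigmoid vs. Softmax comparison above concrete, here is a minimal pure-Python sketch; the input scores are made up for illustration.

```python
import math

def sigmoid(z):
    # Squashes one score into a probability for binary classification.
    return 1.0 / (1.0 + math.exp(-z))

def softmax(zs):
    # Turns a list of scores into a probability distribution over classes.
    m = max(zs)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

print(sigmoid(0.5))              # one probability, e.g. P(class = 1)
print(softmax([2.0, 1.0, 0.1]))  # probabilities over 3 classes, sums to 1
```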
1. Deep Learning Framework Basics
- Abstract model using a PyTorch class: 1.Pytorch-Basic.py
- How to use Google Colaboratory
- Convert `manual` gradient descent to `auto` gradient descent (see the sketch below)
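A minimal sketch of the manual-to-auto conversion, using torch.autograd instead of hand-derived gradients; the toy data and learning rate are illustrative assumptions, not the course's actual values.

```python
import torch

# Same linear regression as before, but gradients come from autograd.
xs = torch.tensor([1.0, 2.0, 3.0, 4.0])
ys = torch.tensor([3.0, 5.0, 7.0, 9.0])

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lr = 0.01

for step in range(1000):
    loss = ((w * xs + b - ys) ** 2).mean()  # mean squared error
    loss.backward()                          # autograd computes dloss/dw, dloss/db
    with torch.no_grad():
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())  # should approach w = 2, b = 1
```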
2. DNN (Deep Neural Network)
- Mathematical Back Propagation in a Deep Neural Network (Eng, Kor1, Kor2)
- Basic classification using a Deep Neural Network
  - ~~Classification: Linear Regression in a Deep Neural Network~~
  - Classification: Logistic Regression in a Deep Neural Network
    - 1-layer classification: 2.DNN-LinearRegression1.py
    - 2-layer classification: 2.DNN-LinearRegression2.py
- Dropout in a Deep Neural Network: 2.DNN-Dropout.py (see the sketch below)
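A minimal sketch of a small classifier with Dropout in PyTorch; the layer sizes and dropout probability are illustrative assumptions, not the values from 2.DNN-Dropout.py.

```python
import torch
import torch.nn as nn

# Two-layer classifier with Dropout between the hidden and output layers.
model = nn.Sequential(
    nn.Linear(784, 128),  # e.g. flattened 28x28 MNIST image -> hidden layer
    nn.ReLU(),
    nn.Dropout(p=0.5),    # randomly zeroes hidden units during training
    nn.Linear(128, 10),   # 10 output classes
)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(32, 784)         # a fake mini-batch of 32 samples
y = torch.randint(0, 10, (32,))  # fake labels

model.train()                    # enables Dropout
loss = criterion(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()

model.eval()                     # disables Dropout for evaluation
```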
3. DataLoader, basic Datasets, and Image handling
- MNIST: 3.DataLoader-MNIST.py (see the sketch after this list)
- Cifar10: 3.DataLoader-Cifar10.py
- Cifar100: 3.DataLoader-Cifar100.py
- Image Folder: 3.DataLoader-ImageFolder.py
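A minimal sketch of loading MNIST with torchvision and iterating over mini-batches; the batch size and download path are illustrative assumptions.

```python
import torch
from torchvision import datasets, transforms

# Download MNIST and convert images to tensors scaled to [0, 1].
train_set = datasets.MNIST(
    root="./data", train=True, download=True,
    transform=transforms.ToTensor(),
)

# DataLoader shuffles the data and serves it in mini-batches.
train_loader = torch.utils.data.DataLoader(
    train_set, batch_size=64, shuffle=True,
)

images, labels = next(iter(train_loader))
print(images.shape)  # torch.Size([64, 1, 28, 28])
print(labels.shape)  # torch.Size([64])
```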
4. CNN (Convolutional Neural Network)
- awesome lecture
- Structure of a CNN
  - 4.CNN-Introduce.py
  - Convolutional Layer
    - Role of filters (kernels) vs. receptive fields
    - Role of Padding
    - Weight sharing in the Convolutional Layer
    - Role of channels and the reason for using multiple channels
  - Weight sharing in a CNN
  - Pooling Layer
    - Max Pooling
    - Average Pooling
- FeedForward in a Convolutional Neural Network
- Mathematical Back Propagation in a Convolutional Neural Network
- Practice: MNIST classification (see the sketch below)
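A minimal sketch of a CNN for MNIST-shaped inputs in PyTorch; the filter counts and layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Small CNN: two conv/pool stages, then a linear classifier.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1x28x28 -> 16x28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 16x28x28 -> 16x14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # 16x14x14 -> 32x14x14
    nn.ReLU(),
    nn.MaxPool2d(2),                              # 32x14x14 -> 32x7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                    # 10 digit classes
)

x = torch.randn(8, 1, 28, 28)  # fake batch of 8 MNIST-shaped images
print(model(x).shape)          # torch.Size([8, 10])
```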
5. RNN (Recurrent Neural Network)
- awesome lecture
- Structure of an RNN
  - 5.RNN-Introduce.py
  - One-to-one vs. One-to-many vs. Many-to-one vs. Many-to-many
  - Hidden State
  - Output Layer
  - Weight sharing in an RNN
- Teacher Forcing vs. No Teacher Forcing
- FeedForward in a Recurrent Neural Network (Eng, Kor)
- Mathematical Back Propagation in a Recurrent Neural Network (Eng, Kor)
- Practice: predict the next word using an RNN (see the sketch below)
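A minimal sketch of many-to-one next-word prediction with nn.RNN; the tiny vocabulary and dimensions are made-up assumptions for illustration.

```python
import torch
import torch.nn as nn

vocab = ["i", "like", "deep", "learning"]  # toy vocabulary (assumption)
V, H = len(vocab), 8                       # vocab size, hidden size

embed = nn.Embedding(V, H)
rnn = nn.RNN(H, H, batch_first=True)
head = nn.Linear(H, V)                     # hidden state -> next-word scores

# Input sequence "i like deep" -> target "learning".
x = torch.tensor([[0, 1, 2]])              # token ids, shape (batch=1, seq=3)
y = torch.tensor([3])

out, _ = rnn(embed(x))                     # out: (1, 3, H)
logits = head(out[:, -1, :])               # use the last hidden state
loss = nn.functional.cross_entropy(logits, y)
loss.backward()                            # one training step's gradients
print(loss.item())
```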
6. LSTM (Long Short-Term Memory)
- Structure of an LSTM
  - 6.LSTM-Introduce.py
  - Hidden State, Cell State
  - Differences between an RNN and an LSTM
  - Output Layer
  - Weight sharing in an LSTM
- FeedForward in an LSTM (Eng, Kor)
- Mathematical Back Propagation in an LSTM (Eng, Kor)
- Bi-directional LSTM (BiLSTM) (Eng, Kor)
- Practice: AutoComplete with an LSTM (see the sketch below)
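A minimal sketch showing nn.LSTM's hidden and cell states, which is the main structural difference from a plain RNN; the dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn

# An LSTM keeps two states per step: hidden state h and cell state c.
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

x = torch.randn(1, 5, 8)  # batch of 1, sequence length 5, feature dim 8
out, (h, c) = lstm(x)

print(out.shape)  # torch.Size([1, 5, 16]) - hidden state at every step
print(h.shape)    # torch.Size([1, 1, 16]) - final hidden state
print(c.shape)    # torch.Size([1, 1, 16]) - final cell state (RNN has no c)

# A bidirectional LSTM reads the sequence in both directions.
bilstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True,
                 bidirectional=True)
out, _ = bilstm(x)
print(out.shape)  # torch.Size([1, 5, 32]) - forward/backward concatenated
```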
7. Application Level
- Vision: Cat vs. Dog image classification (see the sketch below)
- Natural Language Processing: positive/negative classification on Naver Movie Reviews
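For the vision project, a minimal sketch of loading a cat/dog dataset with torchvision's ImageFolder; the directory layout and image size are assumptions, not the course's actual data.

```python
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

# Assumed layout: data/train/cat/*.jpg and data/train/dog/*.jpg.
# ImageFolder assigns one class index per subdirectory name.
transform = transforms.Compose([
    transforms.Resize((224, 224)),  # assumed input size
    transforms.ToTensor(),
])

train_set = datasets.ImageFolder("data/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

print(train_set.classes)  # e.g. ['cat', 'dog']
```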
Reference
- Andrew Ng - Machine Learning Lecture
- Korean Andrew Ng Notebook: WikiBook
Author
- Tae Hwan Jung(Jeff Jung) @graykode
- Author Email: [email protected]