Technion ECE 046211 - Deep Learning
Jupyter Notebook tutorials for the Technion's ECE 046211 course "Deep Learning"
Student Projects Website • Video Tutorials (Winter 2024)
- ee046211-deep-learning
- Agenda
- Running The Notebooks
- Running Online
- Running Locally
- Installation Instructions
- Libraries to Install
Agenda
| File | Topics Covered | Video |
|---|---|---|
| `Setting Up The Working Environment.pdf` | Guide for installing Anaconda locally with Python 3 and PyTorch, integration with PyCharm, and using a GPU on Google Colab | - |
| `ee046211_tutorial_01_machine_learning_recap.ipynb/pdf` | Supervised and Unsupervised Learning, Model Evaluation, Bias-Variance Tradeoff, Feature Scaling, Linear Regression, Gradient Descent, Regularization (Ridge, LASSO) | Video Link |
| `ee046211_tutorial_02_single_neuron_recap.ipynb/pdf` | Discriminative Models, Perceptron, Logistic Regression (also in PyTorch), Softmax Regression, Activation Functions | Video Link |
| `ee046211_tutorial_03_optimization_gradient_descent.ipynb/pdf` | Unimodal Functions, Convexity, Hessian, Gradient Descent, SGD, Learning Rate, LR Scheduling/Annealing, Momentum, Nesterov Momentum, Adaptive Learning Rate Methods, Adagrad, RMSprop, Adam, AdaBelief, MADGRAD, Adan, Schedule-free Optimization (SGD, Adam) | Video Link - Part 1, Video Link - Part 2 |
| `ee046211_tutorial_04_differentiation_autograd.ipynb/pdf` | Lagrange Multipliers, Automatic Differentiation (AutoDiff) Forward Mode and Reverse Mode, PyTorch Autograd | Video Link |
| `ee046211_tutorial_05_multilayer_nn.ipynb/pdf` | Multi-Layer Perceptron (MLP), Backpropagation, Neural Networks in PyTorch, Weight Initialization - Xavier (Glorot), Kaiming (He), Deep Double Descent | Video Link |
| `ee046211_tutorial_06_convnets_visual_tasks.ipynb/pdf` | 2D Convolution (Cross-correlation), Convolution-based Classification, Convolutional Neural Networks (CNNs), Regularization and Overfitting, Dropout, Data Augmentation, the CIFAR-10 dataset, Visualizing Filters, Applications of CNNs, the Problems with CNNs (adversarial attacks, poor generalization, fairness and undesirable biases) | Video Link - Part 1, Video Link - Part 2 |
| `ee046211_tutorial_07_sequential_tasks_rnn.ipynb/pdf` | Sequential Tasks, Natural Language Processing (NLP), Language Models, Perplexity, BLEU, Recurrent Neural Networks (RNN), Backpropagation Through Time (BPTT), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), RWKV, xLSTM, Multi-head Self-Attention, Transformer, BERT and GPT, Teacher Forcing, torchtext, Sentiment Analysis, Transformer Warmup, Initialization, GLU Variants, Pre-norm and Post-norm, RMSNorm, SandwichNorm, ReZero, Rectified Adam (RAdam), Relative Positional Encoding/Embedding | Video Link - Part 1, Video Link - Part 2, Video Link - Part 3 |
| `ee046211_tutorial_08_training_methods.ipynb/pdf` | Feature Scaling, Normalization, Standardization, Batch Normalization, Layer Normalization, Instance Normalization, Group Normalization, Vanishing Gradients, Exploding Gradients, Skip-Connections, Residual Blocks, ResNet, DenseNet, U-Net, Hyper-parameter Tuning: Grid Search, Random Search, Bayesian Tuning, Optuna with PyTorch | Video Link, Video Link - Optuna Tutorial |
| `ee046211_tutorial_09_self_supervised_representation_learning.ipynb/pdf` | Transfer Learning, Domain Adaptation, Pre-trained Networks, Sim2Real, BERT, Low-rank Adaptation (LoRA), DoRA, Representation Learning, Self-Supervised Learning, Autoencoders, Contrastive Learning, Contrastive Predictive Coding (CPC), Simple Framework for Contrastive Learning of Visual Representations (SimCLR), Momentum Contrast (MoCo), Bootstrap Your Own Latent (BYOL), DINO, CLIP | Video Link - Part 1 - Transfer Learning, Video Link - Part 2 - Self-supervised Learning |
| `ee046211_tutorial_10_compression_pruning_amp.ipynb/pdf` | Resource Efficiency in DL, Automatic Mixed Precision (AMP), Quantization (Dynamic, Static), Quantization-Aware Training (QAT), LLM Quantization, Pruning, the Lottery Ticket Hypothesis | Video Link |
| `pytorch_maximize_cpu_gpu_utilization.ipynb/pdf` | Tips and tricks for efficient coding in PyTorch, maximizing CPU and GPU utilization, nvidia-smi, the PyTorch Profiler, AMP, Multi-GPU Training, HF Accelerate, RL Libraries | Video Link |
Running The Notebooks
You can view the tutorials online, or download them and run locally.
Running Online
| Service | Usage |
|---|---|
| Jupyter Nbviewer | Render and view the notebooks (cannot edit) |
| Binder | Render, view and edit the notebooks (limited time) |
| Google Colab | Render, view, edit and save the notebooks to Google Drive (limited time) |
Jupyter Nbviewer:
Press the "Open in Colab" button below to use Google Colab:
Or press the "launch binder" button below to launch in Binder:
Note: creating the Binder instance takes ~5-10 minutes, so be patient.
Running Locally
Press "Download ZIP" under the green button Clone or download or use git to clone the repository using the
following command: git clone https://github.com/taldatech/ee046211-deep-learning.git (in cmd/PowerShell in Windows or in the Terminal in Linux/Mac)
Open the folder in Jupyter Notebook (it is recommended to use Anaconda). Installation instructions can be found in `Setting Up The Working Environment.pdf`.
Installation Instructions
For the complete guide, with step-by-step images, please consult Setting Up The Working Environment.pdf
- Get Anaconda with Python 3; follow the instructions according to your OS (Windows/Mac/Linux) at: https://www.anaconda.com/download
- Install the basic packages using the provided `environment.yml` file by running `conda env create -f environment.yml`, which will create a new conda environment named `deep_learn`. If you did this, you will only need to install PyTorch; see the table below.
- Alternatively, you can create a new environment for the course and install packages from scratch: in Windows, open `Anaconda Prompt` from the start menu; in Mac/Linux, open the terminal; then run `conda create --name deep_learn`. Full guide at https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html#creating-an-environment-with-commands
- To activate the environment, open the terminal (or `Anaconda Prompt` in Windows) and run `conda activate deep_learn` (a quick way to verify the activation is sketched after this list).
- Install the required libraries according to the table below (to search for a specific library and the corresponding command, you can also look at https://anaconda.org/).
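Once the environment is activated, you can check that the active interpreter really comes from `deep_learn` before installing anything (a minimal sanity check, not part of the official guide):

```python
import sys

# The interpreter path should point inside the conda environment,
# e.g. .../anaconda3/envs/deep_learn/... on Linux/Mac
# or   ...\anaconda3\envs\deep_learn\... on Windows.
print(sys.executable)
print(sys.version)  # should report the Python version installed in the environment
```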
Libraries to Install
| Library | Command to Run |
|---|---|
| Jupyter Notebook | `conda install -c conda-forge notebook` |
| `numpy` | `conda install -c conda-forge numpy` |
| `matplotlib` | `conda install -c conda-forge matplotlib` |
| `pandas` | `conda install -c conda-forge pandas` |
| `scipy` | `conda install -c anaconda scipy` |
| `scikit-learn` | `conda install -c conda-forge scikit-learn` |
| `seaborn` | `conda install -c conda-forge seaborn` |
| `tqdm` | `conda install -c conda-forge tqdm` |
| `opencv` | `conda install -c conda-forge opencv` |
| `optuna` | `pip install optuna` |
| `pytorch` (cpu) | `conda install pytorch torchvision torchaudio cpuonly -c pytorch` (get the command from PyTorch.org) |
| `pytorch` (gpu) | `conda install pytorch torchvision torchaudio pytorch-cuda=11.8 -c pytorch -c nvidia` (get the command from PyTorch.org) |
| `torchtext` | `conda install -c pytorch torchtext` |
| `torchdata` | `conda install -c pytorch torchdata` + `pip install portalocker` |
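After installing, a short sanity check (a minimal sketch, assuming the packages above were installed into the activated `deep_learn` environment) confirms that the main libraries import correctly and reports whether PyTorch can see a GPU:

```python
# Sanity check for the course environment: import the main libraries and print versions.
import numpy as np
import pandas as pd
import matplotlib
import sklearn
import torch

print("numpy:", np.__version__)
print("pandas:", pd.__version__)
print("matplotlib:", matplotlib.__version__)
print("scikit-learn:", sklearn.__version__)
print("torch:", torch.__version__)

# False is expected for the CPU-only install; True means the GPU build and drivers work.
print("CUDA available:", torch.cuda.is_available())
```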
- To open the notebooks, open Anaconda Navigator or run `jupyter notebook` in the terminal (or `Anaconda Prompt` in Windows) while the `deep_learn` environment is activated.