
Deep Learning for Biology course materials / Higher School of Economics, 2018


This is a repository of course materials for the Deep Learning for Biology course.

The course was taught in Fall 2018 at the Higher School of Economics (Moscow), Faculty of Computer Science, Master's Programme "Data Analysis in Biology and Medicine".

Contents

  • Course slides
  • Course Jupyter notebooks (using Keras)
  • List of articles for Journal club

Syllabus

1. Artificial Intelligence: Current state and Overview

  • Short history
  • Current results in Deep Learning
  • Images and Video
  • Speech and Sound
  • Text and Language
  • Robotic control
  • ML for systems
  • Problems with DL
  • Other approaches to AI
  • Knowledge and Representation
  • Symbolic approaches
  • Evolutionary computations and Swarm intelligence
  • Hardware

2. Introduction to Neural Networks

  • Intro to NNs: neuron, neural network, backpropagation
  • Feed-forward NNs (FNN)
  • Autoencoders (AE)

3. Keras practice

  • Notebook: Keras Intro (FNN: Binary classification, Multi-class classification, Regression)
  • Notebook: Autoencoders
  • Notebook: Variational autoencoder

4. Convolutional NNs (CNN) and Image processing

  • DL for computer vision cases
  • CNNs
  • Keras practice. Notebook: CNN for classification, CNN autoencoders, Visualizing CNNs: Saliency maps, grad-CAM, FCNs
  • Keras practice. Notebook: Playing with autoencoders
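To make the core CNN operation concrete (a minimal NumPy sketch, separate from the Keras notebooks above): valid-mode 2D cross-correlation, applied as a hand-written vertical-edge detector on a toy image.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the core op of a CNN layer."""
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image: left half 0, right half 1; the [-1, 1] kernel fires only
# on the vertical edge between the halves.
img = np.zeros((5, 5))
img[:, 3:] = 1.0
edge_kernel = np.array([[-1.0, 1.0]])
resp = conv2d(img, edge_kernel)
```

A Keras `Conv2D` layer does the same thing, except the kernel values are learned rather than hand-set.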

5-6. Real-life modern CNNs

  • Activations, Regularization, Augmentation, etc.
  • Models: LeNet, AlexNet, VGG, GoogLeNet, Inception, ResNet, DenseNet, Xception
  • How to use pretrained models in Keras. Notebook: using pretrained CNN models

7. Transfer Learning

  • Theory
  • Keras practice. Notebook: Transfer learning using VGG

8. Advanced CNNs

  • 1D, 3D, dilated convolutions
  • Detection: R-CNN, Fast R-CNN, Faster R-CNN, YOLO
  • Fully-convolutional CNNs (FCNs)
  • Deconvolutional networks (Transposed convolution)
  • Generative Adversarial Networks (GANs)
  • Style Transfer
  • Notebook: FCN example, classification using only convolutions

9. Recurrent NNs (RNNs)

  • RNN basics, Backpropagation through time
  • Long short-term memory (LSTM)
  • Advanced RNNs: Bidirectional RNNs, Multidimensional RNNs
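The "unrolling over time" idea behind RNNs and backpropagation through time can be sketched in a few lines of NumPy (illustrative shapes and random weights, not course code): the same weight matrices are reused at every time step, producing one hidden state per step.

```python
import numpy as np

# Vanilla RNN forward pass: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b).
rng = np.random.default_rng(1)
T, d_in, d_h = 4, 3, 5                 # sequence length, input dim, hidden dim
xs = rng.normal(size=(T, d_in))
W_xh = rng.normal(size=(d_h, d_in)) * 0.1
W_hh = rng.normal(size=(d_h, d_h)) * 0.1
b = np.zeros(d_h)

h = np.zeros(d_h)
states = []
for x in xs:                           # the same weights are reused each step
    h = np.tanh(W_xh @ x + W_hh @ h + b)
    states.append(h)
states = np.stack(states)              # (T, d_h): one hidden state per step
```

Backpropagation through time differentiates through this loop; the repeated multiplication by `W_hh` is what causes vanishing/exploding gradients, which LSTMs are designed to mitigate.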

10. Practice: Generating text using RNNs

  • Keras example. Notebook: Text generation
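A key step in RNN text generation is sampling the next token from the model's softmax output, usually with a temperature knob. A small self-contained sketch (toy logits, not tied to the notebook's model):

```python
import numpy as np

def sample_with_temperature(logits, temperature, rng):
    """Sample a token index from softmax(logits / temperature).
    Low temperature sharpens the distribution (near-greedy decoding);
    high temperature flattens it (more diverse, noisier text)."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()                         # numerical stability
    p = np.exp(z)
    p /= p.sum()
    return rng.choice(len(p), p=p)

rng = np.random.default_rng(0)
logits = np.array([2.0, 1.0, 0.1])       # hypothetical next-token scores
idx = sample_with_temperature(logits, 0.5, rng)
```

At generation time this is called once per character/word, feeding each sampled token back into the network as the next input.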

11. Practice: Text classification using RNNs

  • Working with texts: vectorizing, one-hot encoding, word embeddings, word2vec, etc.
  • Keras example: sentence-based classification using RNN/LSTM/BLSTM
  • Keras example: sentence-based classification using 1D CNN
  • Keras example: sentence-based classification using RNN+CNN
  • Notebook with examples
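The vectorizing/one-hot-encoding step above can be sketched in plain NumPy (toy corpus, purely illustrative): build a vocabulary, map tokens to indices, then one-hot encode.

```python
import numpy as np

# Toy corpus (made up). A learned embedding would replace the identity
# matrix below with a dense (vocab_size, embedding_dim) matrix, as in
# word2vec or a Keras Embedding layer.
corpus = "the cell divides and the cell grows".split()
vocab = sorted(set(corpus))
index = {w: i for i, w in enumerate(vocab)}

# One-hot encoding: row i of eye(V) is the vector for vocabulary item i.
one_hot = np.eye(len(vocab))[[index[w] for w in corpus]]
```

The resulting `(num_tokens, vocab_size)` matrix is what then gets fed into an RNN/LSTM or 1D CNN classifier.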

12. Sequence Learning (seq2seq)

  • Multimodal Learning
  • Seq2seq
  • Encoder-Decoder
  • Beam search
  • Attention mechanisms, Visualizing attention, Hard and Soft attention, Self-Attention
  • Augmented RNNs
  • Connectionist Temporal Classification (CTC)
  • Non-RNN Sequence Learning, problems with RNNs
  • Convolutional Sequence Learning
  • Self-Attention Neural Networks (SAN): Transformer Architecture
  • Transformer: The next steps (Image Transformer, BERT, Universal Transformer)
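The self-attention mechanism at the heart of the Transformer fits in a few lines of NumPy (random toy projections, not course code): scaled dot-product attention, `softmax(Q Kᵀ / √d_k) V`, where Q, K, V are linear projections of the same input sequence.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: softmax(Q K^T / sqrt(d_k)) V."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # token-to-token similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                  # 4 tokens, 8-dim features
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

Each output token is a weighted mix of all input tokens, with no recurrence; that is what lets the Transformer process sequences in parallel, unlike an RNN.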