Representation-Learning-Reading-List

A reading list of papers on representation learning, with links to the papers, venue names, and year of publication.

  • Representation Learning: A Review and New Perspectives. [Paper] [TPAMI] [2013]
  • Discriminative unsupervised feature learning with convolutional neural networks. [Paper] [NeurIPS] [2014]
  • Unsupervised Visual Representation Learning by Context Prediction. [Paper] [ICCV] [2015]
  • Learning to see by moving. [Paper] [ICCV] [2015]
  • Unsupervised learning of visual representations using videos. [Paper] [ICCV] [2015]
  • Distilling the knowledge in a neural network. [Paper] [arXiv] [2015]
  • Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. [Paper] [ICCV] [2015]
  • Discriminative unsupervised feature learning with exemplar convolutional neural networks. [Paper] [TPAMI] [2015]
  • Deep unsupervised exemplar learning. [Paper] [NeurIPS] [2016]
  • Colorful image colorization. [Paper] [ECCV] [2016]
  • InfoGAN: Interpretable representation learning by information maximizing generative adversarial nets. [Paper] [NeurIPS] [2016]
  • Shuffle and learn: Unsupervised learning using temporal order verification. [Paper] [ECCV] [2016]
  • Unsupervised learning of visual representations by solving jigsaw puzzles. [Paper] [ECCV] [2016]
  • Adversarially learned inference. [Paper] [arXiv] [2016]
  • Learning deep parsimonious representations. [Paper] [NeurIPS] [2016]
  • Learning visual features from large weakly supervised data. [Paper] [ECCV] [2016]
  • Multi-task Self-Supervised Visual Learning. [Paper] [ICCV] [2017]
  • Unsupervised learning by predicting noise. [Paper] [ICML] [2017]
  • Unsupervised Representation Learning by Sorting Sequences. [Paper] [ICCV] [2017]
  • Self-supervised video representation learning with odd-one-out networks. [Paper] [CVPR] [2017]
  • Predicting deeper into the future of semantic segmentation. [Paper] [ICCV] [2017]
  • Invariant Representations without Adversarial Training. [Paper] [NeurIPS] [2018]
  • Self-supervised learning of geometrically stable features through probabilistic introspection. [Paper] [CVPR] [2018]
  • MINE: Mutual Information Neural Estimation. [Paper] [ICML] [2018]
  • Improvements to context based self-supervised learning. [Paper] [CVPR] [2018]
  • Unsupervised Feature Learning via Non-Parametric Instance Discrimination. [Paper] [CVPR] [2018]
  • Unsupervised Representation Learning by Predicting Image Rotations. [Paper] [ICLR] [2018]
  • Boosting self-supervised learning via knowledge transfer. [Paper] [CVPR] [2018]
  • A critical analysis of self-supervision, or what we can learn from a single image. [Paper] [arXiv preprint] [2019]
  • Unsupervised Deep Learning by Neighbourhood Discovery. [Paper] [arXiv preprint] [2019]
  • Scaling and benchmarking self-supervised visual representation learning. [Paper] [ICCV] [2019]
  • Self-supervised learning of pretext-invariant representations. [Paper] [arXiv preprint] [2019]
  • Autoaugment: Learning augmentation strategies from data. [Paper] [CVPR] [2019]
  • S4L: Self-Supervised Semi-Supervised Learning. [Paper] [ICCV] [2019]
  • Large scale adversarial representation learning. [Paper] [NeurIPS] [2019]
  • Representation Learning with Contrastive Predictive Coding. [Paper] [arXiv preprint] [2019]
  • Learning Deep Representations By Mutual Information Estimation and Maximization [Paper] [ICLR] [2019]
  • Learning representations by maximizing mutual information across views. [Paper] [NeurIPS] [2019]
  • Self-supervised representation learning by rotation feature decoupling. [Paper] [CVPR] [2019]
  • Unsupervised embedding learning via invariant and spreading instance feature. [Paper] [CVPR] [2019]
  • AET vs. AED: Unsupervised Representation Learning by Auto-Encoding Transformations rather than Data. [Paper] [CVPR] [2019]
  • Learning Generalized Transformation Equivariant Representations via AutoEncoding Transformations. [Paper] [arXiv] [2019]
  • AE2 Nets: Autoencoder in Autoencoder Networks. [Paper] [CVPR] [2019]
  • Local Aggregation for Unsupervised Learning of Visual Embeddings. [Paper] [ICCV] [2019]
  • Contrastive multiview coding. [Paper] [arXiv] [2019]
  • Revisiting Self-Supervised Visual Representation Learning. [Paper] [CVPR] [2019]
  • Unsupervised pre-training of image features on non-curated data. [Paper] [ICCV] [2019]
  • Self-supervised visual feature learning with deep neural networks: A survey. [Paper] [TPAMI] [2019]
  • Phase Transitions For The Information Bottleneck In Representation Learning. [Paper] [ICLR] [2020]
  • A Simple Framework for Contrastive Learning of Visual Representations. [Paper] [ICML] [2020] (see the contrastive-loss sketch after this list)
  • Momentum Contrast for Unsupervised Visual Representation Learning. [Paper] [CVPR] [2020]
  • Improved baselines with momentum contrastive learning. [Paper] [arXiv preprint] [2020]
  • How Useful Is Self-Supervised Pretraining for Visual Tasks? [Paper] [CVPR] [2020]
  • Prototypical Contrastive Learning of Unsupervised Representations. [Paper] [arXiv] [2020]
  • Evolving Losses for Unsupervised Video Representation Learning. [Paper] [CVPR] [2020]
  • Self-Supervised Learning of Interpretable Keypoints From Unlabelled Videos. [Paper] [CVPR] [2020]
  • Automatic Shortcut Removal for Self-Supervised Representation Learning. [Paper] [ICML] [2020]
  • Unsupervised Image Classification for Deep Representation Learning. [Paper] [ECCV] [2020]
  • Understanding Contrastive Representation Learning through Alignment and Uniformity on the Hypersphere. [Paper] [ICML] [2020]
  • Deep Isometric Learning for Visual Recognition. [Paper] [ICML] [2020]
  • Learning De-biased Representations with Biased Representations. [Paper] [ICML] [2020]
  • Data-Efficient Image Recognition with Contrastive Predictive Coding. [Paper] [ICML] [2020]
  • When Does Self-Supervision Help Graph Convolutional Networks? [Paper] [ICML] [2020]
  • Adaptive Adversarial Multi-task Representation Learning. [Paper] [ICML] [2020]
  • Parametric Instance Classification for Unsupervised Visual Feature Learning. [Paper] [NeurIPS] [2020]
  • Big Self-Supervised Models are Strong Semi-Supervised Learners. [Paper] [NeurIPS] [2020]
  • Self-supervised Learning: Generative or Contrastive. [Paper] [arXiv] [2020]
  • Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning. [Paper] [arXiv] [2020]
  • What makes for good views for contrastive learning. [Paper] [arXiv] [2020]
  • Generative Pretraining from Pixels. [Paper] [ICML] [2020]
  • Self-supervised Co-training for Video Representation Learning. [Paper][NeurIPS][2020]
  • Contrastive Learning with Hard Negative Samples. [Paper][arXiv][2020]
  • Contrastive Representation Learning: A Framework and Review. [Paper][IEEE Access][2020]
  • Hard Negative Mixing for Contrastive Learning. [Paper] [NeurIPS][2020]
  • Representation learning from videos in-the-wild: An object-centric approach. [Paper][arXiv][2020]
  • On the surprising similarities between supervised and self-supervised models. [Paper][arXiv][2020]
  • Understanding Self-supervised Learning with Dual Deep Networks. [Paper][arXiv][2020]
  • Adversarial Self-Supervised Contrastive Learning. [Paper][NeurIPS][2020]
  • Self-Supervised Relationship Probing. [Paper][NeurIPS][2020]
  • Self-Supervised Learning by Cross-Modal Audio-Video Clustering. [Paper][NeurIPS][2020]
  • Self-Supervised Generative Adversarial Compression. [Paper][NeurIPS][2020]
  • Unsupervised Representation Learning by Invariance Propagation. [Paper][NeurIPS][2020]
  • Unsupervised Data Augmentation for Consistency Training. [Paper][NeurIPS][2020]
  • Unsupervised Semantic Aggregation and Deformable Template Matching for Semi-Supervised Learning. [Paper][NeurIPS][2020]
  • Self-Supervised Visual Representation Learning from Hierarchical Grouping. [Paper][NeurIPS][2020]
  • Self-training with Noisy Student improves ImageNet classification. [Paper][CVPR][2020]
  • Boosting Contrastive Self-Supervised Learning with False Negative Cancellation. [Paper][arXiv][2020]
  • ISD: Self-Supervised Learning by Iterative Similarity Distillation. [Paper][arXiv][2020]
  • Dense Contrastive Learning for Self-Supervised Visual Pre-Training. [Paper][CVPR][2021]
  • SEED: Self-supervised Distillation For Visual Representation. [Paper][ICLR][2021]
  • Dual-stream Multiple Instance Learning Network for Whole Slide Image Classification with Self-supervised Contrastive Learning. [Paper][CVPR][2021]
  • Self-supervised Video Representation Learning by Context and Motion Decoupling. [Paper][CVPR][2021]
  • Self-supervised Motion Learning from Static Images. [Paper][CVPR][2021]
  • Self-Supervised Learning Across Domains. [Paper][TPAMI][2021]
  • Neighbor2Neighbor: Self-Supervised Denoising from Single Noisy Images. [Paper][CVPR][2021]
  • How Well Do Self-Supervised Models Transfer? [Paper][CVPR][2021]
  • Barlow Twins: Self-Supervised Learning via Redundancy Reduction. [Paper][arXiv][2021]
  • Graph Self-Supervised Learning: A Survey. [Paper][IJCAI Paper Track][2021]
  • With a Little Help from My Friends: Nearest-Neighbor Contrastive Learning of Visual Representations. [Paper][arXiv][2021]
  • OBoW: Online Bag-of-Visual-Words Generation for Self-Supervised Learning. [Paper][CVPR][2021]
  • Emerging properties in self-supervised vision transformers. [Paper][arXiv][2021]
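
Several of the papers above (e.g. CPC, SimCLR, MoCo) train encoders with an InfoNCE-style contrastive objective. Below is a minimal NumPy sketch of the NT-Xent loss popularized by SimCLR; the function name, temperature value, and toy inputs are illustrative assumptions, not code from any listed paper.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Minimal NT-Xent (SimCLR-style) contrastive loss sketch.

    z1, z2: (N, D) embeddings of two augmented views of the same N images;
    row i of z1 and row i of z2 form a positive pair.
    """
    z = np.concatenate([z1, z2], axis=0)                 # (2N, D)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)     # L2-normalize rows
    sim = (z @ z.T) / temperature                        # scaled cosine similarities
    np.fill_diagonal(sim, -np.inf)                       # exclude self-pairs

    n = z1.shape[0]
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])  # index of each positive

    sim = sim - sim.max(axis=1, keepdims=True)           # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

# toy usage with random embeddings standing in for encoder outputs
z1, z2 = np.random.randn(8, 32), np.random.randn(8, 32)
print(nt_xent_loss(z1, z2))
```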

Clustering

  • Mean Shift: A Robust Approach Toward Feature Space Analysis. [Paper] [TPAMI] [2002]
  • Semi-Supervised Kernel Mean Shift Clustering. [Paper] [TPAMI] [2014]
  • Neural Network-based Clustering Using Pairwise Constraints. [Paper] [ICLR] [2016]
  • Regularization with stochastic transformations and perturbations for deep semi-supervised learning. [Paper] [NeurIPS] [2016]
  • Unsupervised Deep Embedding for Clustering Analysis. [Paper] [ICML] [2016]
  • Joint Unsupervised Learning of Deep Representations and Image Clusters. [Paper] [CVPR] [2016]
  • Improved deep embedded clustering with local structure preservation. [Paper] [IJCAI] [2017]
  • Learning Discrete Representations via Information Maximizing Self-Augmented Training. [Paper] [ICML] [2017]
  • Revisiting unreasonable effectiveness of data in deep learning era. [Paper] [ICCV] [2017]
  • Towards k-means-friendly spaces: Simultaneous deep learning and clustering. [Paper] [ICML] [2017]
  • Deep adaptive image clustering. [Paper] [ICCV] [2017]
  • Semi-Supervised Clustering with Neural Networks. [Paper] [arXiv] [2018]
  • Discriminatively boosted image clustering with fully convolutional auto-encoders. [Paper] [Pattern Recognition] [2018]
  • Deep Clustering for Unsupervised Learning of Visual Features. [Paper] [ECCV] [2018] (see the pseudo-label sketch after this list)
  • Clustering by Directly Disentangling Latent Space. [Paper] [CVPR] [2019]
  • ClusterGAN : Latent Space Clustering in Generative Adversarial Networks. [Paper] [AAAI] [2019]
  • Deep Spectral Clustering using Dual Autoencoder Network. [Paper] [CVPR] [2019]
  • N2D: (Not Too) Deep Clustering via Clustering the Local Manifold of an Autoencoded Embedding. [Paper] [arXiv] [2019]
  • Deep Comprehensive Correlation Mining for Image Clustering. [Paper] [ICCV] [2019]
  • Robust Embedded Deep K-means Clustering. [Paper] [CIKM] [2019]
  • Invariant Information Clustering for Unsupervised Image Classification and Segmentation. [Paper] [ICCV] [2019]
  • Unsupervised Clustering using Pseudo-semi-supervised Learning. [Paper] [ICLR] [2020]
  • Self-labelling via simultaneous clustering and representation learning. [Paper] [ICLR] [2020]
  • Learning To Classify Images Without Labels. [Paper] [ECCV] [2020]
  • Unsupervised Learning of Visual Features by Contrasting Cluster Assignments. [Paper] [arXiv] [2020]
  • Dissimilarity Mixture Autoencoder for Deep Clustering. [Paper] [arXiv] [2020]
  • Online Deep Clustering for Unsupervised Representation Learning. [Paper] [CVPR] [2020]
  • End-to-End Adversarial-Attention Network for Multi-Modal Clustering. [Paper] [CVPR] [2020]
  • Deep Robust Clustering by Contrastive Learning. [Paper][arXiv][2020]
  • Deep Transformation-Invariant Clustering. [Paper][NeurIPS][2020]
  • Scalable Bottom-Up Hierarchical Clustering. [Paper](https://arxiv.org/abs/2010.11821v1)[arXiv][2020]
  • Multi-Modal Deep Clustering: Unsupervised Partitioning of Images. [Paper][ICPR][2020]
  • Data Structures & Algorithms for Exact Inference in Hierarchical Clustering. [Paper][arXiv][2020]
  • Deep Clustering and Representation Learning that Preserves Geometric Structures. [Paper][arXiv][2020]
  • Adversarial Learning for Robust Deep Clustering. [Paper][NeurIPS][2020]
  • Consensus Clustering With Unsupervised Representation Learning. [Paper][arXiv][2020]
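
DeepCluster-style methods in the list above alternate between clustering the current features and training the network on the resulting assignments as pseudo-labels. Below is a minimal sketch of that clustering/pseudo-label step, assuming scikit-learn's KMeans is available; the function name and toy inputs are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans  # assumes scikit-learn is installed

def pseudo_labels_from_features(features, n_clusters=10):
    """One clustering step of a DeepCluster-style loop (illustrative only).

    features: (N, D) encoder outputs for N images. Returns cluster
    assignments that would serve as pseudo-labels for the next round of
    classifier training.
    """
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)

# toy alternation: extract features -> cluster -> train on pseudo-labels
features = np.random.randn(256, 64)              # stand-in for encoder outputs
labels = pseudo_labels_from_features(features)   # pseudo-labels for the classifier head
print(labels[:10])
```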

Semi-Supervised Learning

  • Semi-Supervised Learning with Ladder Networks. [Paper][NeurIPS][2015]
  • Semi-Supervised Learning with Generative Adversarial Networks. [Paper][arXiv][2016]
  • Temporal ensembling for semi-supervised learning. [Paper][ICLR][2017]
  • Regularization With Stochastic Transformations and Perturbations for Deep Semi-Supervised Learning. [Paper][NeurIPS][2016]
  • Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results. [Paper][NeurIPS][2017]
  • Virtual adversarial training: a regularization method for supervised and semi-supervised learning. [Paper][TPAMI][2018]
  • Smooth Neighbors on Teacher Graphs for Semi-Supervised Learning. [Paper][CVPR][2018]
  • Realistic Evaluation of Deep Semi-Supervised Learning Algorithms. [Paper][NeurIPS][2018]
  • Deep Co-Training for Semi-Supervised Image Recognition. [Paper][ECCV][2018]
  • Transductive Semi-Supervised Deep Learning using Min-Max Features. [Paper][ECCV][2018]
  • Semi-Supervised Deep Learning with Memory. [Paper][ECCV][2018]
  • MixMatch: A Holistic Approach to Semi-Supervised Learning. [Paper][NeurIPS][2019]
  • Pseudo-labeling and confirmation bias in deep semi-supervised learning. [Paper][IJCNN][2019]
  • Interpolation Consistency Training for Semi-Supervised Learning. [Paper][IJCAI][2019]
  • Label Propagation for Deep Semi-supervised Learning. [Paper][CVPR][2019]
  • FixMatch: Simplifying Semi-Supervised Learning with Consistency and Confidence. [Paper][NeurIPS][2020] (see the pseudo-labeling sketch after this list)
  • DivideMix: Learning with Noisy Labels as Semi-supervised Learning. [Paper][ICLR][2020]
  • ReMixMatch: Semi-Supervised Learning with Distribution Alignment and Augmentation Anchoring. [Paper][ICLR][2020]
  • CoMatch: Semi-supervised Learning with Contrastive Graph Regularization. [Paper][arXiv][2021]
  • Semi-Supervised Action Recognition with Temporal Contrastive Learning. [Paper][CVPR][2021]
  • Sharpness-aware Minimization for Efficiently Improving Generalization. [Paper][ICLR][2021]
  • Meta Pseudo Labels. [Paper][CVPR][2021]
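
FixMatch and related methods above keep only confidently predicted unlabeled samples as pseudo-labels and train on their strongly augmented versions. Below is a minimal NumPy sketch of that selection step; the function name and threshold value are illustrative assumptions.

```python
import numpy as np

def confident_pseudo_labels(probs, threshold=0.95):
    """FixMatch-style selection sketch: keep only high-confidence predictions.

    probs: (N, C) predicted class probabilities on weakly augmented
    unlabeled images. Returns (indices, labels) for samples whose top
    probability exceeds the threshold; those samples would then be trained
    on under strong augmentation with these labels as targets.
    """
    confidence = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    keep = confidence >= threshold
    return np.flatnonzero(keep), labels[keep]

# toy usage with softmaxed random logits
logits = np.random.randn(16, 10)
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
idx, lab = confident_pseudo_labels(probs, threshold=0.5)
print(idx, lab)
```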

Data Augmentation

  • mixup: Beyond Empirical Risk Minimization. [Paper][ICLR][2018] (see the mixup sketch after this list)
  • Manifold Mixup: Better Representations by Interpolating Hidden States. [Paper][ICML][2019]
  • MixUp as Locally Linear Out-Of-Manifold Regularization. [Paper][AAAI][2019]
  • Improved Regularization of Convolutional Neural Networks with Cutout. [Paper][arXiv][2017]
  • CutMix: Regularization Strategy to Train Strong Classifiers with Localizable Features. [Paper][ICCV][2019]
  • Puzzle Mix: Exploiting Saliency and Local Statistics for Optimal Mixup. [Paper][ICML][2020]
  • Attentive Cutmix: An Enhanced Data Augmentation Approach for Deep Learning Based Image Classification. [Paper][ICASSP][2020]
  • How Does Mixup Help With Robustness and Generalization? [Paper][ICLR][2021]
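
mixup (first entry above) trains on convex combinations of input pairs and their labels, with the mixing coefficient drawn from a Beta distribution. Below is a minimal NumPy sketch; the function name, variable names, and alpha value are illustrative.

```python
import numpy as np

def mixup_batch(x, y_onehot, alpha=0.2):
    """mixup sketch: blend a batch with a shuffled copy of itself.

    x:        (N, ...) input batch
    y_onehot: (N, C) one-hot labels
    Returns mixed inputs and the correspondingly mixed (soft) labels.
    """
    lam = np.random.beta(alpha, alpha)        # mixing coefficient
    perm = np.random.permutation(len(x))      # random pairing within the batch
    x_mixed = lam * x + (1 - lam) * x[perm]
    y_mixed = lam * y_onehot + (1 - lam) * y_onehot[perm]
    return x_mixed, y_mixed

# toy usage on a tiny "image" batch with 10 classes
x = np.random.randn(4, 3, 8, 8)
y = np.eye(10)[np.random.randint(0, 10, size=4)]
xm, ym = mixup_batch(x, y)
print(xm.shape, ym.shape)
```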

Disentanglement

  • Disentangling Factors of Variation by Mixing Them. [Paper] [CVPR] [2018]
  • Weakly Supervised Disentanglement With Guarantees. [Paper] [ICLR] [2020]

Metric Learning

  • Improved deep metric learning with multi-class n-pair loss objective. [Paper] [NeurIPS] [2016] (see the N-pair loss sketch after this list)
  • Deep Metric Learning with Tuplet Margin Loss. [Paper] [ICCV] [2019]
  • SoftTriple Loss: Deep Metric Learning Without Triplet Sampling. [Paper] [ICCV] [2019]
  • Supervised Contrastive Learning. [Paper] [arXiv preprint] [2020]
  • Proxy Anchor Loss for Deep Metric Learning. [Paper] [CVPR] [2020]
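
The multi-class N-pair loss (first entry above) treats each anchor's matching positive as the correct "class" among all positives in the batch. Below is a minimal NumPy sketch under illustrative names.

```python
import numpy as np

def n_pair_loss(anchors, positives):
    """Multi-class N-pair loss sketch.

    anchors, positives: (N, D) embeddings where positives[i] matches
    anchors[i]; every other positive in the batch acts as a negative.
    Equivalent to softmax cross-entropy over the similarity matrix with
    the true pairs on the diagonal.
    """
    sim = anchors @ positives.T                     # (N, N) inner products
    sim = sim - sim.max(axis=1, keepdims=True)      # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.diag(log_prob).mean()

# toy usage
a, p = np.random.randn(8, 16), np.random.randn(8, 16)
print(n_pair_loss(a, p))
```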

Book Chapters

  • Representation Learning. Deep Learning book, Chapter 15. [Book]