Summary: This is a non-exhaustive list of references for this component.
Note: Preprocessing (or Representation Learning) is often incorrectly referred to as "pretraining".
See also Awesome-AutoML-Papers#Automated Feature Engineering.
See also AutoML.
2012
2017
2018
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- Emergence of Invariance and Disentanglement in Deep Representations
- Towards a Definition of Disentangled Representations
- Automating Feature Engineering in Supervised Learning
2019
- ViLBERT: Pretraining Task-Agnostic Visiolinguistic Representations for Vision-and-Language Tasks
- Theory and Evaluation Metrics for Learning Disentangled Representations
2020
- MMFT-BERT: Multimodal Fusion Transformer with BERT Encodings for Visual Question Answering
- Measuring Disentanglement: A Review of Metrics
- An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale
2021
- Self-Supervised Structured Representations for Deep Reinforcement Learning
- HuBERT: Self-Supervised Speech Representation Learning by Masked Prediction of Hidden Units
- BEiT: BERT Pre-Training of Image Transformers
- ViLT: Vision-and-Language Transformer Without Convolution or Region Supervision
- Benchmarks, Algorithms, and Metrics for Hierarchical Disentanglement
- Self-Supervised Pretraining Improves Self-Supervised Pretraining
- Task-customized Self-supervised Pre-training with Scalable Dynamic Routing
- AMMUS: A Survey of Transformer-based Pretrained Models in Natural Language Processing
2022
2018
2019
- Multimodal Representation Learning: Advances, Trends and Challenges
- Deep Multimodal Representation Learning: A Survey
2021
- Recent Advances and Trends in Multimodal Deep Learning: A Review
- Exploiting Multimodal Reinforcement Learning for Simultaneous Machine Translation
2022
2019
- SOLAR: Deep Structured Representations for Model-Based Reinforcement Learning
- Task-Agnostic Dynamics Priors for Deep Reinforcement Learning
- Recall Traces: Backtracking Models for Efficient Reinforcement Learning
2021
- Exploiting Multimodal Reinforcement Learning for Simultaneous Machine Translation
- Pretraining Representations for Data-Efficient Reinforcement Learning
2022