Tutorial Time and Location
Time: 9:00am-12:30pm PDT, July 10, 2022
- A tutorial abstract in the conference proceedings with a brief description of the tutorial content [PDF]
- Tutorial slides [slides]
- Tutorial video [video]
- A comprehensive survey paper with a detailed introduction to and discussion of contrastive learning for NLP [arXiv]
- A paper reading list on contrastive learning for NLP [Github repo]
Current NLP models rely heavily on effective representation learning algorithms. Contrastive learning is one such technique: it learns an embedding space in which similar data sample pairs have close representations while dissimilar samples stay far apart from each other. It can be used in supervised or unsupervised settings with different loss functions to produce task-specific or general-purpose representations. While it originally enabled success in vision tasks, recent years have seen a growing number of publications on contrastive learning for NLP. This line of work not only delivers promising performance improvements on various NLP tasks, but also provides desirable properties such as task-agnostic sentence representations, faithful text generation, data-efficient learning in zero-shot and few-shot settings, and interpretability and explainability.
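To make the objective concrete, here is a minimal NumPy sketch of one widely used contrastive loss, InfoNCE, where each anchor is pulled toward its matching positive and pushed away from the other examples in the batch, which serve as in-batch negatives. The function name, the batch layout (matching pairs on the diagonal), and the temperature value are illustrative choices, not something prescribed by the tutorial.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE loss: row i of `positives` is the positive for row i of
    `anchors`; every other row in the batch acts as a negative."""
    # L2-normalize so the dot product is cosine similarity
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # Pairwise similarity matrix, scaled by the temperature
    logits = (a @ p.T) / temperature
    # Cross-entropy with the diagonal (matching pairs) as the targets
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Minimizing this loss pulls each matched pair together and pushes apart the rest of the batch, which is the "similar close, dissimilar far apart" behavior described above; the temperature controls how sharply hard negatives are penalized.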
In this tutorial, we aim to provide a gentle introduction to the fundamentals of contrastive learning approaches and the theory behind them. We then survey the benefits and best practices of contrastive learning for various downstream NLP applications, including Text Classification, Question Answering, Summarization, Text Generation, Interpretability and Explainability, Commonsense Knowledge and Reasoning, and Vision-and-Language.
This tutorial intends to help researchers in the NLP and computational linguistics communities understand this emerging topic and to promote future research directions on contrastive learning for NLP applications.
- Part 1: Foundations of Contrastive Learning
- Contrastive Learning Objectives
- Contrastive Data Sampling and Augmentation Strategies
- Successful Applications
- Analysis of Contrastive Learning
- Part 2: Contrastive Learning for NLP
- Contrastive Learning in NLP Tasks
- Task-agnostic Representation
- Faithful Text Generation
- Data-efficient Learning
- Interpretability and Explainability
- Part 3: Lessons Learned, Practical Advice, and Future Directions
- Lessons Learned
- Practical Advice
- Future Directions