Tutorial Time and Location
Location: Columbia A + Zoom
Time: 2:00 - 5:30pm PDT, July 10, 2022
Zoom Q&A sessions: 1:30 - 2:00pm, 6:00 - 6:45pm PDT, July 10, 2022
Tutorial Materials
- Tutorial abstract in the conference proceedings [PDF]
- Tutorial slides [slides]
- Tutorial video [video]
- Paper reading list of contrastive learning for NLP [Github]
Abstract
Current NLP models rely heavily on effective representation learning algorithms. Contrastive learning is one such technique: it learns an embedding space in which similar data pairs have close representations while dissimilar pairs stay far apart. It can be used in supervised or unsupervised settings with different loss functions to produce task-specific or general-purpose representations. While it originally enabled success in vision tasks, recent years have seen a growing number of publications on contrastive learning for NLP. This line of work not only delivers promising performance improvements on various NLP tasks, but also provides desirable properties such as task-agnostic sentence representations, faithful text generation, data-efficient learning in zero-shot and few-shot settings, and interpretability and explainability.
In this tutorial, we aim to provide a gentle introduction to the fundamentals of contrastive learning approaches and the theory behind them. We then survey the benefits and best practices of contrastive learning for various downstream NLP applications, including Text Classification, Question Answering, Summarization, Text Generation, Interpretability and Explainability, Commonsense Knowledge and Reasoning, and Vision-and-Language.
This tutorial intends to help researchers in the NLP and computational linguistics community understand this emerging topic and to promote future research on using contrastive learning for NLP applications.
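To make the core idea above concrete, here is a minimal NumPy sketch of one widely used contrastive objective, the InfoNCE loss with in-batch negatives: each anchor embedding is pulled toward its paired positive (e.g. an augmented view of the same sentence) and pushed away from the other positives in the batch. The function name, toy data, and temperature value are illustrative assumptions, not part of the tutorial materials.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE contrastive loss: pull each anchor toward its own positive,
    treating every other positive in the batch as an in-batch negative."""
    # L2-normalize so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # logits[i, j] = sim(anchor_i, positive_j) / temperature
    logits = a @ p.T / temperature
    # cross-entropy with the matching pairs (the diagonal) as targets
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Toy batch: 3 anchor embeddings and their positives (near-identical views)
rng = np.random.default_rng(0)
anchors = rng.normal(size=(3, 8))
positives = anchors + 0.01 * rng.normal(size=(3, 8))
loss_aligned = info_nce_loss(anchors, positives)
loss_random = info_nce_loss(anchors, rng.normal(size=(3, 8)))
# Well-aligned pairs should give a lower loss than unrelated pairs
print(loss_aligned < loss_random)
```

The temperature hyperparameter controls how sharply the loss concentrates on hard negatives; small values (e.g. 0.05-0.1) are common in practice, and supervised variants generalize this by allowing multiple positives per anchor.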
Outline
- Part 1: Foundations of Contrastive Learning
- Contrastive Learning Objectives
- Contrastive Data Sampling and Augmentation Strategies
- Successful Applications
- Analysis of Contrastive Learning
- Part 2: Contrastive Learning for NLP
- Contrastive Learning in NLP Tasks
- Task-agnostic Representation
- Faithful Text Generation
- Data-efficient Learning
- Interpretability and Explainability
- Part 3: Lessons Learned, Practical Advice, and Future Directions
- Lessons Learned
- Practical Advice
- Future Directions