What is “in-context learning”?

Angelina Yang
3 min read · Aug 27, 2023

Welcome to today’s data science interview challenge! Today’s challenge is inspired by the Natural Language Understanding (NLU) lecture given by Professor Christopher Potts, Chair of the Department of Linguistics at Stanford. Here it goes:

Question 1: What is “in-context learning” in the context of Large Language Models (LLMs)?

Question 2: Can you contrast standard supervised learning with few-shot in-context learning?

Source: A Survey on In-context Learning (2023)

Here are some tips for readers’ reference:

Question 1:

In-context learning (ICL) is a paradigm that allows large language models (LLMs) to learn a task from only a few examples provided as demonstrations in the prompt. This is in contrast to traditional machine learning, where models are trained on large datasets of labeled data.

The key idea behind ICL is that an LLM can generalize from a few examples by identifying the underlying pattern they share. The model simply conditions on the demonstrations as context at inference time, so the task is picked up without any gradient updates to the model’s weights.
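To make this concrete, here is a minimal Python sketch of how a few-shot ICL prompt might be assembled. The sentiment task, the demonstration texts, and the `build_icl_prompt` helper are all hypothetical, invented purely for illustration:

```python
# Minimal sketch of few-shot in-context learning: the "learning" happens
# entirely in the prompt; the model's weights are never updated.
# The demonstrations and the build_icl_prompt helper are hypothetical,
# made up for illustration only.

demonstrations = [
    ("The plot was predictable and the acting was flat.", "negative"),
    ("A moving, beautifully shot film with a stellar cast.", "positive"),
    ("I checked my watch twice before the halfway mark.", "negative"),
]

def build_icl_prompt(demos, query):
    """Format (input, label) demonstrations plus a new query into one prompt."""
    lines = ["Classify the sentiment of each review as positive or negative.\n"]
    for review, label in demos:
        lines.append(f"Review: {review}\nSentiment: {label}\n")
    # The final line is left incomplete: the model's completion is the answer.
    lines.append(f"Review: {query}\nSentiment:")
    return "\n".join(lines)

prompt = build_icl_prompt(demonstrations, "An instant classic, I loved every minute.")
print(prompt)  # send this string to any LLM completion endpoint; no fine-tuning needed
```

The point of the sketch is that the demonstrations live only in the context window: swap in different examples and the same frozen model performs a different task.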

For example, an LLM could be given the following examples to learn the task of summarizing news articles:

  • “The president gave a speech…
