Is a Bigger or a Smaller “Perplexity” Better?

Angelina Yang
2 min read · Jul 5, 2022

“Perplexity” is the standard evaluation metric for language models. There are plenty of in-depth explanations elsewhere, so here we’d like to share tips on what you can say in an interview setting.

What is “Perplexity”?

It is defined as the inverse probability of the corpus according to the language model, normalized by the number of words. For every word x(t) in the corpus, we compute the probability the model assigns to that word given everything that came before it, and take its inverse (one over that probability). We multiply these inverse probabilities together over the whole corpus and normalize the big product by the number of words, T, by taking the T-th root. The reason for normalizing is that otherwise the perplexity would just keep growing as the corpus gets bigger.

Source of Image: Stanford CS224N Lecture 6
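For reference, the standard definition shown in the lecture can be written out as follows (T is the number of words in the corpus, and P_LM is the probability the language model assigns to a word given its preceding context):

perplexity = \prod_{t=1}^{T} \left( \frac{1}{P_{\mathrm{LM}}\left(x^{(t)} \mid x^{(t-1)}, \dots, x^{(1)}\right)} \right)^{1/T}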

Note: In industry, language models are increasingly used as precursor models for downstream NLP tasks, so they are often also evaluated on how well they perform on those downstream tasks.

Watch Dr. Abby See’s explanation!

Is a bigger or a smaller “perplexity” better?

For the “perplexity” metric, smaller is better, because it is the (normalized) inverse probability of the corpus: if you want your language model to assign high probability to the corpus, then you want it to have low perplexity.
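To make this concrete, here is a minimal Python sketch (an illustration of the definition above, not code from the lecture) that computes perplexity from hypothetical per-word probabilities. The model that assigns higher probabilities to the corpus ends up with the lower perplexity:

```python
import math

def perplexity(word_probs):
    """Perplexity = exp(average negative log-probability),
    i.e. the normalized inverse probability of the corpus."""
    T = len(word_probs)
    return math.exp(-sum(math.log(p) for p in word_probs) / T)

# Hypothetical per-word probabilities assigned by two language models
# to the same 4-word corpus.
good_model = [0.4, 0.5, 0.3, 0.6]
bad_model = [0.1, 0.05, 0.2, 0.1]

print(perplexity(good_model))  # ~2.3  -> lower perplexity, better model
print(perplexity(bad_model))   # 10.0  -> higher perplexity, worse model
```

Computing it via the average negative log-probability (the cross-entropy) rather than multiplying raw probabilities is the usual way to do this in practice, since it avoids numerical underflow on long corpora.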

Check out the explanation!

Source of video: Stanford CS224N (Winter 2019), Lecture 6 — Language Models and RNNs, by Dr. Abby See

Good read: Two minutes NLP — Perplexity explained with simple probabilities by Fabio Chiusano
