What’s the Difference Between Attention and Self-attention in Transformer Models?

Angelina Yang
2 min read · Jul 19, 2022

“Attention” is one of the key ideas in the transformer architecture. There are plenty of in-depth explanations elsewhere, so here we’d like to share tips on what you can say in an interview setting.

What’s the difference between attention and self-attention in transformer models?

Source of image: Medium — Attention is all you need by Vincent Mueller
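One way to make the distinction concrete is with a small sketch of scaled dot-product attention. The key point: in generic (cross-) attention the queries come from one sequence and the keys/values from another, while in self-attention Q, K, and V all come from the same sequence. The NumPy sketch below is illustrative only (variable names, toy dimensions, and the omission of learned projection matrices are simplifying assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of each query to each key
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)

# Cross-attention: queries from one sequence, keys/values from another
# (e.g. decoder states attending to encoder states).
decoder_states = rng.normal(size=(3, 4))  # 3 target positions, d_k = 4
encoder_states = rng.normal(size=(5, 4))  # 5 source positions
out, w = scaled_dot_product_attention(decoder_states, encoder_states, encoder_states)
print(out.shape)  # one context vector per query: (3, 4)

# Self-attention: Q, K, V are all derived from the SAME sequence,
# so every position attends to every other position in that sequence.
x = rng.normal(size=(5, 4))
out_self, w_self = scaled_dot_product_attention(x, x, x)
print(out_self.shape)  # (5, 4)
```

Note that in a real transformer layer, Q, K, and V are not the raw inputs but learned linear projections of them; the sketch skips those projections to keep the cross- vs. self-attention contrast front and center.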