What is Layer Normalization in Deep Learning?

Angelina Yang
2 min read · Mar 21, 2023

There are plenty of explanations elsewhere; here I’d like to share some example questions in an interview setting.

What is layer normalization in deep learning?

Source: Illustration of layer normalization (left) and batch/power normalization (right). The entries colored in blue show the components used for calculating the statistics.

Here are some tips for readers’ reference:

In batch normalization, the input values of the same neuron are normalized across all the data samples in the mini-batch. We discussed batchnorm in this previous post.

In layer normalization, by contrast, the input values of all the neurons in the same layer are normalized within each data sample.
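The difference between the two boils down to which axis the statistics are computed over. Here is a minimal NumPy sketch (variable names and shapes are my own, for illustration): for a batch of shape `(batch, features)`, batchnorm normalizes each column and layernorm normalizes each row.

```python
import numpy as np

# Hypothetical mini-batch: 4 samples, each with 3 features (neurons).
np.random.seed(0)
x = np.random.randn(4, 3)
eps = 1e-5

# Batch normalization: statistics per neuron, across the batch (axis 0).
bn_mean = x.mean(axis=0, keepdims=True)   # shape (1, 3)
bn_var = x.var(axis=0, keepdims=True)
x_bn = (x - bn_mean) / np.sqrt(bn_var + eps)

# Layer normalization: statistics per sample, across the features (axis 1).
ln_mean = x.mean(axis=1, keepdims=True)   # shape (4, 1)
ln_var = x.var(axis=1, keepdims=True)
x_ln = (x - ln_mean) / np.sqrt(ln_var + eps)

# After batchnorm, each feature column is roughly zero-mean, unit-variance;
# after layernorm, each sample row is roughly zero-mean, unit-variance.
print(x_bn.mean(axis=0))  # ~[0, 0, 0]
print(x_ln.mean(axis=1))  # ~[0, 0, 0, 0]
```

(In a real layer, both methods also apply learned scale and shift parameters after this normalization step; those are omitted here to keep the axis difference front and center.)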

Batch normalization works well with fully connected and convolutional neural network (CNN) layers, but it shows poor results with recurrent neural networks (RNNs). The main advantage of layer normalization, on the other hand, is that it works really well with RNNs.
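One way to see why layer normalization suits RNNs: its output for a given sample does not depend on the rest of the batch, so it behaves identically at inference time, with a batch size of one, or across variable-length sequences. A small sketch (the helper functions `layer_norm` and `batch_norm` are my own simplified versions, without the learned scale/shift):

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize each sample over its own features: no batch statistics needed.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def batch_norm(x, eps=1e-5):
    # Normalize each feature over the batch: depends on the other samples.
    mu = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

np.random.seed(1)
sample = np.random.randn(1, 4)                       # one hidden state
batch_a = np.vstack([sample, np.random.randn(2, 4)]) # same sample, batch A
batch_b = np.vstack([sample, np.random.randn(2, 4)]) # same sample, batch B

# Layer norm gives the sample the same output in either batch...
same = np.allclose(layer_norm(batch_a)[0], layer_norm(batch_b)[0])
# ...while batch norm's output for it changes with the batch contents.
differs = not np.allclose(batch_norm(batch_a)[0], batch_norm(batch_b)[0])
print(same, differs)  # True True
```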

AssemblyAI has a great video explaining this, including contrasting pros and cons of the two.

Let’s check it out!

Last week, I announced my free consulting sessions for “career in data science”. Thank you to those who
