What is Layer Normalization in Deep Learning?
There are plenty of explanations elsewhere; here I’d like to share some example questions in an interview setting.
What is layer normalization in deep learning?
Here are some tips for readers’ reference:
In batch normalization, the input values of a given neuron are normalized across all the samples in the mini-batch. We discussed batch norm in this previous post.
In layer normalization, by contrast, the input values of all the neurons in the same layer are normalized within each individual data sample.
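To make the difference concrete, here is a minimal NumPy sketch. It is a simplification: the shapes are made up, and the learned scale and shift parameters (gamma and beta) that both techniques use in practice are omitted.

```python
import numpy as np

# Hypothetical mini-batch: 4 samples, each with 3 features.
x = np.random.randn(4, 3)
eps = 1e-5  # small constant for numerical stability

# Batch norm (inference-style): normalize each feature
# across the batch dimension (axis 0).
bn = (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

# Layer norm: normalize each sample across its own
# feature dimension (axis 1), independently of the batch.
ln = (x - x.mean(axis=1, keepdims=True)) / np.sqrt(
    x.var(axis=1, keepdims=True) + eps
)
```

The only real difference is the axis over which the mean and variance are computed: batch norm’s statistics depend on the other samples in the batch, while layer norm’s depend only on the sample itself.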
Batch normalization works well with fully connected layers and convolutional neural networks (CNNs), but it performs poorly with recurrent neural networks (RNNs), since batch statistics would have to be maintained separately for each time step and become unreliable with variable-length sequences or small batches. The main advantage of layer normalization is that it works well with RNNs, because each sample is normalized independently of the rest of the batch; see the sketch below.
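As an illustration, here is a minimal PyTorch sketch of layer normalization applied to the hidden states of an LSTM. The dimensions are assumed for the example; this is one common way to combine the two, not the only one.

```python
import torch
import torch.nn as nn

# Assumed example dimensions.
batch, seq_len, input_size, hidden_size = 8, 20, 16, 32

lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
norm = nn.LayerNorm(hidden_size)  # normalizes over the feature dimension

x = torch.randn(batch, seq_len, input_size)
out, _ = lstm(x)   # out: (batch, seq_len, hidden_size)
out = norm(out)    # same shape; each time step of each sample is
                   # normalized on its own, so sequence length and
                   # batch size do not affect the statistics
```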
AssemblyAI has a great video explaining this, including a comparison of the pros and cons of the two approaches.
Let’s check it out!