
Why Do Neural Network Weights Need to Be Randomly Initialized❓

Angelina Yang
1 min read · Jul 11, 2022


There are plenty of in-depth explanations of neural network weight initialization elsewhere, so here we'd like to share tips on what you can say in an interview setting.

Why do neural network weights need to be randomly initialized?

Source of image: Quora — Why is initializing weights by small numbers bad in neural networks?

Here are some example answers for readers’ reference:

We use a random function to initialize weights because, without random initialization, you run into what is called the symmetry problem: every neuron in a layer receives the same gradient and learns essentially the same thing. Random initialization makes the neurons start at different places, so they can evolve as independently from each other as possible.
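The symmetry problem can be seen directly in a tiny NumPy sketch. This is an illustrative toy example (the data, network shape, and constant-initialization value are assumptions, not from the article): with every weight set to the same constant, one backprop step gives both hidden units identical gradient columns, so they can never differentiate; random initialization breaks the tie.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, binary targets (illustrative values)
X = rng.normal(size=(4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

def w1_gradient(W1, W2):
    """One forward/backward pass of a 3-2-1 sigmoid network; returns dL/dW1."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    h = sigmoid(X @ W1)                    # hidden activations, shape (4, 2)
    out = sigmoid(h @ W2)                  # predictions, shape (4, 1)
    d_out = (out - y) * out * (1 - out)    # squared-loss + sigmoid derivative
    d_h = (d_out @ W2.T) * h * (1 - h)     # backprop into the hidden layer
    return X.T @ d_h                       # gradient w.r.t. W1, shape (3, 2)

# Constant init: both hidden units get identical gradients -> they stay identical
g_const = w1_gradient(np.full((3, 2), 0.5), np.full((2, 1), 0.5))
print(np.allclose(g_const[:, 0], g_const[:, 1]))  # True: the symmetry problem

# Small random init breaks the symmetry: the two gradient columns differ
g_rand = w1_gradient(rng.normal(scale=0.1, size=(3, 2)),
                     rng.normal(scale=0.1, size=(2, 1)))
print(np.allclose(g_rand[:, 0], g_rand[:, 1]))  # False
```

With constant weights, the two hidden units compute identical activations, receive identical error signals, and therefore stay clones of each other forever; the random draw is what lets them specialize.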

Watch Dr. Kian Katanforoosh’s explanation:


Happy practicing!

Source of video: Stanford CS229 Lecture 12 (Autumn 2018) — Backprop & Improving Neural Networks by Dr. Kian Katanforoosh
