A Refresher on Softmax!
There are plenty of in-depth explanations elsewhere, so here I'd like to share some example questions you might encounter in an interview setting.
What is the softmax activation for the output layer of a neural network, and what is its effect?
Here are some example answers for readers’ reference:
The softmax function converts a vector of K real numbers into a probability distribution over K possible outcomes. It is a generalization of the logistic function to multiple dimensions and is used in multinomial logistic regression. The softmax function is often used as the last activation function of a neural network to normalize the output of the network into a probability distribution over the predicted output classes.
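In equation form, softmax maps an input vector of logits $\mathbf{z} = (z_1, \dots, z_K)$ to:

$$\sigma(\mathbf{z})_i = \frac{e^{z_i}}{\sum_{j=1}^{K} e^{z_j}}, \quad i = 1, \dots, K$$

Each output is positive and the outputs sum to 1, which is what makes the result a valid probability distribution.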
Softmax produces a probability distribution over the predicted classes; the class with the maximum probability becomes the prediction, as shown in the sketch below.
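To make that second answer concrete, here is a minimal NumPy sketch (the logit values are made up for illustration): it normalizes a vector of logits into a probability distribution and takes the class with the maximum probability as the prediction.

```python
import numpy as np

def softmax(logits):
    """A minimal softmax sketch: subtract the max for numerical stability,
    exponentiate, and normalize so the outputs sum to 1."""
    shifted = logits - np.max(logits)
    exps = np.exp(shifted)
    return exps / np.sum(exps)

# Hypothetical logits for a 3-class classifier
logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)        # approximately [0.659, 0.242, 0.099], sums to 1
prediction = np.argmax(probs)  # class 0, since it has the maximum probability
```

Subtracting the maximum logit before exponentiating does not change the result mathematically, but it avoids overflow when logits are large, which is why most implementations do it.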