Answer Posted / Manoj Singh
Batch normalization is a technique used in deep learning to stabilize training and improve model generalization. It normalizes each layer's activations to roughly zero mean and unit variance across a mini-batch, then applies learned scale and shift parameters, which helps reduce internal covariate shift and speeds up convergence.
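A minimal NumPy sketch of the forward pass described above (the function name and shapes are illustrative, not from any particular library):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a mini-batch x of shape (batch, features) per feature."""
    mean = x.mean(axis=0)                     # per-feature mean over the batch
    var = x.var(axis=0)                       # per-feature variance over the batch
    x_hat = (x - mean) / np.sqrt(var + eps)   # zero mean, unit variance
    return gamma * x_hat + beta               # learned scale and shift

# Activations with an arbitrary offset and spread
x = np.random.randn(32, 4) * 3.0 + 5.0
y = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
# With gamma=1, beta=0, each feature of y has ~zero mean and ~unit variance
```

At inference time, frameworks replace the batch statistics with running averages collected during training, so the output does not depend on the other examples in the batch.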