Batch Normalization
Outside the Norm
Batch normalization is a technique that normalizes layer outputs to accelerate neural network training. But new research shows that it has other effects that may be more important.
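The normalization the summary describes can be sketched in a few lines. This is a minimal NumPy illustration of the standard batch-norm computation (normalize each feature over the batch, then apply a learned scale and shift), not the implementation used in any particular study; the function name and arguments are chosen for this example.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature across the batch dimension,
    # then apply a learnable scale (gamma) and shift (beta).
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Example: a batch of 4 samples, each with 3 features.
x = np.random.randn(4, 3)
out = batch_norm(x, gamma=np.ones(3), beta=np.zeros(3))
# Each column of `out` now has roughly zero mean and unit variance.
```

During training this keeps each feature's distribution stable from step to step; at inference time, implementations substitute running averages of the batch statistics.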