While modern convolutional neural networks achieve outstanding accuracy on many image classification tasks, they are much more sensitive than humans to image degradation. Here, we describe a variant of Batch Normalization, LocalNorm, that regularizes the normalization layer in the spirit of Dropout while dynamically adapting to the local image intensity and contrast at test time. We show that the resulting deep neural networks are much more resistant to noise-induced image degradation, improving accuracy by up to three times, while achieving the same or slightly better accuracy on non-degraded classical benchmarks. In computational terms, LocalNorm adds negligible training cost and little or no cost at inference time, and can be applied to already-trained networks in a straightforward manner.
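To make the idea concrete, the following is a minimal NumPy sketch of a LocalNorm-style layer, reconstructed from the abstract alone: the batch is split into groups and each group is normalized with its own per-channel statistics, so normalization is stochastic across groups during training (Dropout-like regularization) and, at test time, the statistics come from the test images themselves rather than from stored running averages, which lets the layer adapt to the inputs' intensity and contrast. The function name, group count, and tensor shapes are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def localnorm(x, gamma, beta, num_groups=4, eps=1e-5):
    """LocalNorm-style normalization sketch (assumptions: name, shapes, grouping).

    x: activations of shape (N, C, H, W). The batch axis is split into
    `num_groups` groups; each group is normalized with its own per-channel
    mean and variance. Because the statistics are computed from the current
    inputs (also at test time), a global shift in image intensity is
    normalized away rather than mismatching stored running statistics,
    as would happen in standard BatchNorm.
    """
    N, C, H, W = x.shape
    assert N % num_groups == 0, "batch must divide evenly into groups"
    out = np.empty_like(x, dtype=np.float64)
    for g in np.split(np.arange(N), num_groups):
        xg = x[g]                                    # one group of images
        mu = xg.mean(axis=(0, 2, 3), keepdims=True)  # per-channel mean
        var = xg.var(axis=(0, 2, 3), keepdims=True)  # per-channel variance
        out[g] = (xg - mu) / np.sqrt(var + eps)
    # shared learnable scale and shift, as in BatchNorm
    return gamma.reshape(1, C, 1, 1) * out + beta.reshape(1, C, 1, 1)
```

In this sketch, adding a constant intensity offset to the whole batch leaves the output unchanged, because each group's mean absorbs the offset; this is one way the "adapting to local image intensity" property described above can arise.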


Yin, B., Scholte, S., & Bohte, S. (2021). LocalNorm: Robust image classification through dynamically regularized normalization. In Proceedings of the 30th International Conference on Artificial Neural Networks (ICANN 2021), Lecture Notes in Computer Science (pp. 240–252). doi:10.1007/978-3-030-86380-7_20