https://arxiv.org/abs/1703.02065v2 On the Expressive Power of Overlapping Architectures of Deep Learning

In this paper we extend the study of expressive efficiency to the attribute of network connectivity, and in particular to the effect of “overlaps” in the convolutional process, i.e., when the stride of the convolution is smaller than its kernel size (receptive field). Our analysis shows that having overlapping local receptive fields, and more broadly denser connectivity, results in an exponential increase in the expressive capacity of neural networks. Moreover, while denser connectivity can increase expressive capacity, we show that the most common types of modern architectures already exhibit an exponential increase in expressivity, without relying on fully-connected layers.
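As a minimal sketch of the distinction the paper studies (the layer sizes and channel counts below are arbitrary choices, not taken from the paper): in PyTorch, a convolution is "overlapping" exactly when its stride is smaller than its kernel size, so that neighboring output units read shared input pixels.

```python
import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)  # dummy input: batch=1, 3 channels, 32x32

# Overlapping: stride (1) < kernel size (3), so adjacent output units
# share 2 of the 3 input rows/columns in their receptive fields.
overlapping = nn.Conv2d(3, 8, kernel_size=3, stride=1, padding=0)

# Non-overlapping: stride (2) == kernel size (2), so receptive fields
# tile the input without sharing any pixels.
non_overlapping = nn.Conv2d(3, 8, kernel_size=2, stride=2, padding=0)

print(overlapping(x).shape)      # torch.Size([1, 8, 30, 30])
print(non_overlapping(x).shape)  # torch.Size([1, 8, 16, 16])
```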

https://arxiv.org/pdf/1703.07928.pdf Guided Perturbations: Self-Corrective Behavior in Convolutional Neural Networks

In a CNN, the receptive fields of neighboring pixels define a context for their interactions. The advantage of overlapping receptive fields is that this neighborhood connectivity is established automatically by the convolution itself, without having to specify it explicitly.
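A small sketch (our own, assuming a 1-D convolution with the usual indexing, not code from the paper) makes the automatic connectivity concrete: adjacent output positions overlap in kernel_size - stride input positions, so neighboring outputs are coupled through shared inputs without any explicit wiring.

```python
def receptive_field(i, kernel_size, stride):
    """Input index range seen by output position i of a 1-D convolution."""
    start = i * stride
    return range(start, start + kernel_size)

k, s = 3, 1  # overlapping: stride < kernel size
a = set(receptive_field(0, k, s))  # {0, 1, 2}
b = set(receptive_field(1, k, s))  # {1, 2, 3}
print(sorted(a & b))  # [1, 2] -- neighbors share k - s = 2 inputs

k, s = 2, 2  # non-overlapping: stride == kernel size
a = set(receptive_field(0, k, s))  # {0, 1}
b = set(receptive_field(1, k, s))  # {2, 3}
print(sorted(a & b))  # [] -- disjoint receptive fields, no shared context
```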