Convolutional LSTM Network
https://openreview.net/pdf?id=H1zJ-v5xl Quasi-Recurrent Neural Networks
We introduce quasi-recurrent neural networks (QRNNs), an approach to neural sequence modeling that alternates convolutional layers, which apply in parallel across timesteps, and a minimalist recurrent pooling function that applies in parallel across channels. Despite lacking trainable recurrent layers, stacked QRNNs have better predictive accuracy than stacked LSTMs of the same hidden size. Due to their increased parallelism, they are up to 16 times faster at train and test time.
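The abstract is specific enough to sketch the mechanism: a causal convolution computes candidate, forget, and output gates for every timestep at once, and a weight-free recurrent "fo-pooling" step (c_t = f_t * c_{t-1} + (1 - f_t) * z_t, h_t = o_t * c_t, per the paper) mixes them along time. Below is a minimal, illustrative PyTorch sketch of one such layer; the class name, kernel size, and other details are assumptions for illustration, not the authors' released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class QRNNLayer(nn.Module):
    """Illustrative single QRNN layer with fo-pooling (not the official code)."""

    def __init__(self, input_size, hidden_size, kernel_size=2):
        super().__init__()
        self.hidden_size = hidden_size
        # One convolution produces all three gates (z, f, o) in parallel
        # across timesteps.
        self.conv = nn.Conv1d(input_size, 3 * hidden_size, kernel_size)
        # Left-pad so the convolution is causal (no access to future steps).
        self.pad = kernel_size - 1

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        x = x.transpose(1, 2)                    # (batch, input, seq)
        x = F.pad(x, (self.pad, 0))              # causal left padding
        gates = self.conv(x).transpose(1, 2)     # (batch, seq, 3*hidden)
        z, f, o = gates.chunk(3, dim=-1)
        z, f, o = torch.tanh(z), torch.sigmoid(f), torch.sigmoid(o)

        # fo-pooling: c_t = f_t * c_{t-1} + (1 - f_t) * z_t.
        # This loop is the only sequential part and contains no trained
        # weights, just element-wise ops applied in parallel across channels.
        c = torch.zeros(x.size(0), self.hidden_size, device=x.device)
        outputs = []
        for t in range(z.size(1)):
            c = f[:, t] * c + (1 - f[:, t]) * z[:, t]
            outputs.append(o[:, t] * c)          # h_t = o_t * c_t
        return torch.stack(outputs, dim=1)       # (batch, seq, hidden)


# Usage: batch of 4 sequences, 10 timesteps, 32 input channels.
layer = QRNNLayer(input_size=32, hidden_size=64)
h = layer(torch.randn(4, 10, 32))
print(h.shape)  # torch.Size([4, 10, 64])
```

Because all trained parameters live in the convolution, the expensive matrix work is done for every timestep simultaneously; only the cheap element-wise pooling loop runs sequentially, which is the source of the claimed speedup over LSTMs.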
http://arxiv.org/pdf/1606.05262v1.pdf Convolutional Residual Memory Networks