We propose a deep learning model inspired by neocortical communication via the thalamus. Our model consists of recurrent neural modules that send features via a routing center, endowing the modules with the flexibility to share features over multiple time steps. We show that our model learns to route information hierarchically, processing input data by a chain of modules. We observe common architectures, such as feed-forward neural networks and skip connections, emerging as special cases of our architecture, while novel connectivity patterns are learned for the text8 compression task. We demonstrate that our model outperforms standard recurrent neural networks on three sequential benchmarks.
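A minimal sketch of the routing idea, assuming soft (softmax) routing weights between a handful of GRU modules; the RoutedModules name, the module sizes, the read-out, and the routing parameterization are illustrative assumptions, not the paper's exact model:

<code python>
# Sketch only: recurrent modules exchanging features via a routing center.
import torch
import torch.nn as nn

class RoutedModules(nn.Module):
    def __init__(self, n_modules=4, in_dim=8, hid_dim=16):
        super().__init__()
        # Each module is a small recurrent cell that reads the raw input
        # plus whatever the routing center delivers from the other modules.
        self.cells = nn.ModuleList(
            nn.GRUCell(in_dim + hid_dim, hid_dim) for _ in range(n_modules))
        # route_logits[i, j] gates how much of module j's last output the
        # center forwards to module i on the next time step.
        self.route_logits = nn.Parameter(torch.zeros(n_modules, n_modules))

    def forward(self, x_seq):                    # x_seq: (T, batch, in_dim)
        batch = x_seq.size(1)
        h = torch.zeros(len(self.cells), batch, self.cells[0].hidden_size)
        for x in x_seq:
            route = torch.softmax(self.route_logits, dim=-1)   # (n, n)
            routed = torch.einsum('ij,jbh->ibh', route, h)     # mix features
            h = torch.stack([
                cell(torch.cat([x, routed[i]], dim=-1), h[i])
                for i, cell in enumerate(self.cells)])
        return h[-1]                             # read out the last module
</code>

In this framing, a routing matrix that puts all weight on a single predecessor degenerates into a feed-forward chain of modules, which is one sense in which standard architectures can appear as special cases.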

https://arxiv.org/abs/1611.04849 Deeply supervised salient object detection with short connections

In this paper, we propose a new method for saliency detection by introducing short connections to the skip-layer structures within the HED architecture. Our framework provides rich multi-scale feature maps at each layer, a property that is critically needed to perform segment detection.
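A rough sketch of what such short connections could look like on top of multi-scale side outputs, assuming a generic backbone with three stages of halving resolution; the ShortConnectionHead name, the channel counts, and fusion by addition are assumptions, not the paper's DSS architecture:

<code python>
# Sketch only: deeper side outputs are upsampled and fused into shallower
# ones, so each level also sees coarser, more semantic evidence.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ShortConnectionHead(nn.Module):
    def __init__(self, channels=(64, 128, 256)):
        super().__init__()
        # One 1x1 conv per backbone stage, producing a 1-channel side map.
        self.side = nn.ModuleList(nn.Conv2d(c, 1, 1) for c in channels)

    def forward(self, feats):
        # feats: backbone features, shallow -> deep, each at half the
        # spatial resolution of the previous stage.
        sides = [conv(f) for conv, f in zip(self.side, feats)]
        preds = []
        for i, s in enumerate(sides):
            fused = s
            # Short connections: add every deeper side map (upsampled to
            # this scale) before making the prediction at this level.
            for deeper in sides[i + 1:]:
                fused = fused + F.interpolate(
                    deeper, size=s.shape[-2:],
                    mode='bilinear', align_corners=False)
            preds.append(torch.sigmoid(fused))
        return preds   # one saliency map per scale
</code>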

https://arxiv.org/abs/1801.05895 Sparsely Connected Convolutional Networks

Based on our analysis, we propose a new structure named SparseNets which achieves better performance with fewer parameters than DenseNets and ResNets.
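A sketch of sparse layer aggregation under one illustrative connectivity: each layer reads predecessors at power-of-two offsets, giving O(log n) inputs per layer rather than DenseNet's O(n). This pattern and the SparseBlock layout are assumptions for illustration and may differ from the paper's exact design:

<code python>
# Sketch only: sparse aggregation with power-of-two skip offsets.
import torch
import torch.nn as nn

class SparseBlock(nn.Module):
    def __init__(self, n_layers=8, ch=32):
        super().__init__()
        # Layer i concatenates len(_sources(i + 1)) earlier tensors of
        # `ch` channels each, then maps back to `ch` channels.
        self.layers = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(len(self._sources(i + 1)) * ch, ch, 3, padding=1),
                nn.BatchNorm2d(ch),
                nn.ReLU(inplace=True))
            for i in range(n_layers))

    @staticmethod
    def _sources(n):
        # Indices into a history of n tensors, taken at offsets
        # 1, 2, 4, ... from the end (O(log n) inputs per layer).
        srcs, k = [], 1
        while k <= n:
            srcs.append(n - k)
            k *= 2
        return srcs

    def forward(self, x):
        outs = [x]  # x is assumed to already have `ch` channels
        for layer in self.layers:
            inp = torch.cat([outs[j] for j in self._sources(len(outs))], dim=1)
            outs.append(layer(inp))
        return outs[-1]
</code>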

https://arxiv.org/abs/1807.04863v1 Avoiding Latent Variable Collapse With Generative Skip Models