hierarchical_abstraction [2018/11/18 18:57]
that gradually simplify the task dynamics.
https://openreview.net/forum?id=S1JHhv6TW Boosting Dilated Convolutional Networks with Mixed Tensor Decompositions
We focus on dilated convolutional networks, a family of deep models delivering state-of-the-art performance in sequence processing tasks. By introducing and analyzing the concept of mixed tensor decompositions, we prove that interconnecting dilated convolutional networks can lead to expressive efficiency. In particular, we show that even a single connection between intermediate layers can already lead to an almost quadratic gap, which in large-scale settings typically makes the difference between a model that is practical and one that is not.
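The key property of dilated convolutional stacks is that the receptive field grows exponentially with depth. A minimal sketch of causal 1-D dilated convolution, with kernel values and dilation schedule chosen purely for illustration (they are assumptions, not taken from the paper):

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation):
    """Causal 1-D convolution with gaps of `dilation - 1` between taps."""
    k = len(kernel)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])  # left-pad so output is causal
    out = np.zeros_like(x, dtype=float)
    for t in range(len(x)):
        for i in range(k):
            # tap i looks `i * dilation` steps into the past
            out[t] += kernel[i] * xp[t + pad - i * dilation]
    return out

x = np.arange(8, dtype=float)
# Stacking layers with dilations 1, 2, 4 grows the receptive field
# to 8 time steps using only 3 layers of kernel size 2.
h = x
for d in (1, 2, 4):
    h = dilated_conv1d(h, np.array([0.5, 0.5]), d)
```

With dilation `d`, each output is a function of `x[t]` and `x[t - d]`, so doubling the dilation per layer covers an exponentially long history at linear parameter cost.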
https://arxiv.org/abs/1807.04640v1 Automatically Composing Representation Transformations as a Means for Generalization
https://arxiv.org/abs/1807.07560v1 Compositional GAN: Learning Conditional Image Composition
https://arxiv.org/pdf/1803.00590.pdf Hierarchical Imitation and Reinforcement Learning
We propose an algorithmic framework, called hierarchical guidance, that leverages the hierarchical structure of the underlying problem to integrate different modes of expert interaction. Our framework can incorporate different combinations of imitation learning (IL) and reinforcement learning (RL) at different levels, leading to dramatic reductions in both expert effort and cost of exploration.
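The idea of mixing IL and RL across levels can be sketched on a toy problem: an expert labels only the high-level subgoal choices (cheap imitation), while the low level learns to reach each subgoal with plain tabular Q-learning. The chain environment, subgoal set, and hyperparameters below are illustrative assumptions, not the paper's setup.

```python
import random

random.seed(0)

N = 12                    # 1-D chain: states 0..N, final goal at N
SUBGOALS = [4, 8, 12]     # macro-choices available to the high level

def expert_subgoal(state):
    """Expert's IL label: head for the nearest subgoal ahead."""
    return next(g for g in SUBGOALS if g > state)

# High level: behavior cloning of the expert's subgoal choices.
hi_policy = {s: expert_subgoal(s) for s in range(N)}

# Low level: Q-learning over primitive moves {-1, +1}, rewarded
# only for reaching the subgoal the high level currently demands.
Q = {}
def low_level_step(state, goal, eps=0.1, alpha=0.5, gamma=0.9):
    key = (state, goal)
    Q.setdefault(key, [0.0, 0.0])
    a = random.randrange(2) if random.random() < eps else Q[key].index(max(Q[key]))
    nxt = max(0, min(N, state + (1 if a == 1 else -1)))
    r = 1.0 if nxt == goal else -0.01          # sparse subgoal reward
    nkey = (nxt, goal)
    Q.setdefault(nkey, [0.0, 0.0])
    Q[key][a] += alpha * (r + gamma * max(Q[nkey]) - Q[key][a])
    return nxt

for _ in range(200):                  # training episodes
    s = 0
    while s < N:
        g = hi_policy[s]              # imitation-trained high level
        for _ in range(20):           # low level pursues the subgoal via RL
            s = low_level_step(s, g)
            if s == g:
                break
        if s != g:
            break                     # subgoal missed: end the episode
```

The point of the construction is the cost split: the expert is queried only at the subgoal level, while exploration is confined to short low-level segments between subgoals.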
https://arxiv.org/pdf/1807.03748.pdf Representation Learning with Contrastive Predictive Coding
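At the heart of Contrastive Predictive Coding is the InfoNCE objective: a context vector must score its true future embedding above negatives drawn from elsewhere. A minimal NumPy sketch, where the dimensions and the bilinear scoring matrix are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def info_nce(context, positives, negatives, W):
    """context: (B, d); positives: (B, d); negatives: (B, K, d)."""
    pos = np.einsum('bd,de,be->b', context, W, positives)    # (B,)
    neg = np.einsum('bd,de,bke->bk', context, W, negatives)  # (B, K)
    logits = np.concatenate([pos[:, None], neg], axis=1)     # (B, 1+K)
    logits -= logits.max(axis=1, keepdims=True)              # numerical stability
    log_softmax = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_softmax[:, 0].mean()  # true future is always class 0

B, K, d = 4, 7, 16
c = rng.normal(size=(B, d))                  # context vectors
W = np.eye(d)                                # bilinear score (assumed identity)
z_pos = c + 0.1 * rng.normal(size=(B, d))    # correlated "future" embeddings
z_neg = rng.normal(size=(B, K, d))           # unrelated negatives
loss = info_nce(c, z_pos, z_neg, W)
```

Because each positive is strongly correlated with its context while negatives are independent, the loss lands well below the chance level of log(K + 1); minimizing it lower-bounds the mutual information between context and future.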