https://arxiv.org/pdf/1707.01926.pdf Diffusion Convolutional Recurrent Neural Network: Data-Driven Traffic Forecasting
  
The task is challenging due to (1) complex spatial dependency on road networks, (2) non-linear temporal dynamics with changing road conditions and (3) inherent difficulty of long-term forecasting. To address these challenges, we propose to model the traffic flow as a diffusion process on a directed graph and introduce Diffusion Convolutional Recurrent Neural Network (DCRNN), a deep learning framework for traffic forecasting that incorporates both spatial and temporal dependency in the traffic flow.

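A minimal numpy sketch of the bidirectional diffusion convolution DCRNN builds its recurrent cells around, assuming a weighted adjacency matrix in which every node has in- and out-neighbours; function and parameter names are illustrative, not from the paper's code.

<code python>
import numpy as np

def diffusion_conv(X, W, theta_fwd, theta_bwd, K):
    """Bidirectional K-step diffusion convolution on a directed graph.

    X:          (num_nodes, num_features) float node signals, e.g. sensor speeds
    W:          (num_nodes, num_nodes) weighted adjacency of the road graph
    theta_fwd:  (K,) filter weights for the forward random walk
    theta_bwd:  (K,) filter weights for the reverse random walk
    """
    # Random-walk transition matrices for the two diffusion directions.
    P_fwd = W / W.sum(axis=1, keepdims=True)      # D_out^-1 W
    P_bwd = W.T / W.T.sum(axis=1, keepdims=True)  # D_in^-1 W^T
    out = np.zeros_like(X, dtype=float)
    Xf = Xb = X
    for k in range(K):
        out += theta_fwd[k] * Xf + theta_bwd[k] * Xb
        Xf, Xb = P_fwd @ Xf, P_bwd @ Xb           # take one more diffusion hop
    return out
</code>

In the full model this operator replaces the matrix multiplications inside a GRU cell, and an encoder-decoder pair of such cells produces the multi-step forecast.
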
https://arxiv.org/abs/1702.04649 Generative Temporal Models with Memory

We consider the general problem of modeling temporal data with long-range dependencies, wherein new observations are fully or partially predictable based on temporally-distant, past observations. A sufficiently powerful temporal model should separate predictable elements of the sequence from unpredictable elements, express uncertainty about those unpredictable elements, and rapidly identify novel elements that may help to predict the future. To create such models, we introduce Generative Temporal Models augmented with external memory systems. They are developed within the variational inference framework, which provides both a practical training methodology and methods to gain insight into the models' operation. We show, on a range of problems with sparse, long-term temporal dependencies, that these models store information from early in a sequence, and reuse this stored information efficiently. This allows them to perform substantially better than existing models based on well-known recurrent neural networks, like LSTMs.

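A rough PyTorch sketch of one timestep of such a model, in the spirit of the paper rather than its exact architecture: read from an external memory by content-based attention, condition both the prior and the approximate posterior on the read vector, and write the new latent back. All module names, the write rule, and dimensions are illustrative assumptions.

<code python>
import torch
import torch.nn as nn

class MemoryGTMStep(nn.Module):
    """One timestep of a variational temporal model with external memory."""
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.prior_net = nn.Linear(z_dim, 2 * z_dim)              # p(z_t | read_t)
        self.posterior_net = nn.Linear(x_dim + z_dim, 2 * z_dim)  # q(z_t | x_t, read_t)
        self.decoder = nn.Linear(z_dim, x_dim)                    # p(x_t | z_t)
        self.query_net = nn.Linear(z_dim, z_dim)

    def forward(self, x_t, z_prev, memory):       # memory: (num_slots, z_dim)
        # Content-based read: attend over memory slots with a learned query.
        query = self.query_net(z_prev)                  # (z_dim,)
        weights = torch.softmax(memory @ query, dim=0)  # (num_slots,)
        read = weights @ memory                         # (z_dim,)
        # Prior and posterior over the current latent, both memory-conditioned.
        p_mu, p_logvar = self.prior_net(read).chunk(2, dim=-1)
        q_mu, q_logvar = self.posterior_net(torch.cat([x_t, read])).chunk(2, dim=-1)
        z_t = q_mu + torch.randn_like(q_mu) * (0.5 * q_logvar).exp()  # reparameterise
        x_recon = self.decoder(z_t)
        # Write the new latent into the least-read slot (one simple scheme).
        memory = memory.clone()
        memory[weights.argmin()] = z_t
        return x_recon, z_t, memory, (p_mu, p_logvar, q_mu, q_logvar)
</code>

Training maximises the usual ELBO: the reconstruction log-likelihood of x_t minus the KL divergence between the posterior and the memory-conditioned prior, summed over time.
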
https://medium.com/the-artificial-impostor/notes-understanding-tensorflow-part-3-7f6633fcc7c7

https://www.arxiv-vanity.com/papers/1803.01271/ An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling

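The TCN evaluated in that paper is built from dilated causal convolutions; here is a minimal PyTorch sketch of one such layer (illustrative, not the authors' reference code — residual connections and nonlinearities are omitted).

<code python>
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """A dilated 1-D convolution that never looks into the future:
    left-pad by (kernel_size - 1) * dilation, then convolve."""
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x):                            # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))      # pad on the left only
        return self.conv(x)

# Stacking layers with dilations 1, 2, 4, ... doubles the receptive field
# at each level, which is how a TCN covers long histories cheaply.
tcn = nn.Sequential(*[CausalConv1d(8, dilation=2 ** i) for i in range(4)])
y = tcn(torch.randn(1, 8, 100))  # output length matches the input length
</code>
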
https://arxiv.org/abs/1703.06846v3 Boosting Dilated Convolutional Networks with Mixed Tensor Decompositions

https://arxiv.org/abs/1808.04063 Time Perception Machine: Temporal Point Processes for the When, Where and What of Activity Prediction

We propose an integrated framework of neural networks and temporal point processes for predicting when the next activity will happen. Because point processes are limited to taking event frames as input, we propose a simple yet effective mechanism to extract features at frames of interest while also preserving the rich information in the remaining frames. We evaluate our model on two challenging datasets. The results show that our model outperforms traditional statistical point process approaches significantly, demonstrating its effectiveness in capturing the underlying temporal dynamics as well as the correlation within sequential activities. Furthermore, we also extend our model to a joint estimation framework for predicting the timing, spatial location, and category of the activity simultaneously, to answer the when, where, and what of activity prediction.

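A compact sketch of the point-process half of this idea: a recurrent network summarises the event history, and an exponential conditional intensity turns it into a likelihood over the next arrival time. This follows the common RMTPP-style parameterisation, not necessarily the paper's exact prediction head.

<code python>
import torch
import torch.nn as nn

class NeuralPointProcess(nn.Module):
    """Recurrent temporal point process: a GRU summarises event history
    and an exponential intensity scores candidate next-event times."""
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.GRUCell(1, hidden)
        self.v = nn.Linear(hidden, 1)             # history influence
        self.w = nn.Parameter(torch.tensor(1.0))  # elapsed-time influence

    def intensity(self, h, dt):
        # lambda(t_last + dt) = exp(v . h + w * dt)
        return torch.exp(self.v(h) + self.w * dt)

    def nll(self, gaps):
        """Negative log-likelihood of a 1-D tensor of inter-event times."""
        h = torch.zeros(1, self.rnn.hidden_size)
        loss = torch.zeros(())
        for dt in gaps:
            lam = self.intensity(h, dt)
            # Compensator: closed-form integral of the intensity over (0, dt].
            compensator = (lam - self.intensity(h, torch.zeros(()))) / self.w
            loss = loss - (torch.log(lam) - compensator)
            h = self.rnn(dt.view(1, 1), h)        # fold the event into history
        return loss.squeeze()

model = NeuralPointProcess()
print(model.nll(torch.tensor([0.4, 1.1, 0.6])))   # trainable via autograd
</code>
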
https://arxiv.org/abs/1802.04687 Neural Relational Inference for Interacting Systems

https://arxiv.org/abs/1808.10594 Proximity Forest: An effective and scalable distance-based classifier for time series

We demonstrate on a 1M time series Earth observation dataset that Proximity Forest retains this accuracy on datasets that are many orders of magnitude greater than those in the UCR repository, while learning its models at least 100,000 times faster than the current state-of-the-art models, Elastic Ensemble and COTE.

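The speed comes from tree splits that compare a series to one randomly chosen exemplar per class under an elastic distance, rather than to the whole training set. A toy sketch of a single split; the real algorithm also samples among several distance measures and candidate splits.

<code python>
import numpy as np

def dtw(a, b):
    """Plain O(mn) dynamic time warping distance between two 1-D series."""
    D = np.full((len(a) + 1, len(b) + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[-1, -1]

def proximity_split(series, labels, rng):
    """One proximity-tree node: pick a random exemplar per class and
    route every series to its nearest exemplar under an elastic distance."""
    exemplars = [series[rng.choice(np.where(labels == c)[0])]
                 for c in np.unique(labels)]
    branches = np.array([np.argmin([dtw(s, e) for e in exemplars])
                         for s in series])
    return exemplars, branches
</code>

A forest of such trees votes at classification time; because each split only measures distances to a handful of exemplars, training scales far better than nearest-neighbour ensembles over the full training set.
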
https://arxiv.org/pdf/1809.04423.pdf https://github.com/codeaudit/neuronal_circuit_policies Re-purposing Compact Neuronal Circuit Policies to Govern Reinforcement Learning Tasks

https://openreview.net/forum?id=BJl_VnR9Km A model cortical network for spatiotemporal sequence learning and prediction

A new hierarchical cortical model for encoding spatiotemporal memory and video prediction

The architecture includes feedforward, feedback, and local recurrent connections, which together implement a predictive coding scheme. Some versions of the network are shown to outperform the similar PredNet and PredRNN architectures on two video prediction tasks: moving MNIST and KTH human actions.

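A minimal numpy sketch of the generic predictive coding relaxation such architectures implement, in the Rao & Ballard style: feedback weights predict the layer below, the residual is the prediction error, and feedforward weights carry that error upward. The paper's cortical model adds specific connectivity on top; everything here is an illustrative simplification.

<code python>
import numpy as np

def predictive_coding_step(x, r, W_fb, W_ff, lr=0.1):
    """One relaxation step of a hierarchical predictive coding network.

    x:    input vector, treated as layer-0 activity
    r:    list of higher-layer activity vectors [r1, ..., rL]
    W_fb: W_fb[l] predicts layer l's activity from layer l+1's (feedback)
    W_ff: W_ff[l] carries layer l's prediction error up to layer l+1
    """
    layers = [x] + r
    # Feedback pass: each layer predicts the one below; the residual
    # is the prediction error at that level.
    errors = [layers[l] - W_fb[l] @ layers[l + 1] for l in range(len(r))]
    # Feedforward / local pass: each hidden layer moves to explain the
    # error below it, while being pulled back by its own error above.
    for l in range(1, len(layers)):
        drive = W_ff[l - 1] @ errors[l - 1]
        if l < len(layers) - 1:
            drive = drive - errors[l]
        layers[l] = layers[l] + lr * drive
    return layers[1:], errors
</code>

Iterating this step over the frames of a video is what lets prediction errors, rather than raw pixels, propagate up the hierarchy.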