summarization [2018/11/11 10:28] (current)
https://arxiv.org/abs/1801.10198 Generating Wikipedia by Summarizing Long Sequences
https://arxiv.org/abs/1804.05685v1 A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents
https://arxiv.org/abs/1808.10792 Bottom-Up Abstractive Summarization
https://arxiv.org/abs/1810.05739 Unsupervised Neural Multi-document Abstractive Summarization
https://arxiv.org/pdf/1811.01824.pdf Structured Neural Summarization
Based on the promising results of graph neural networks on highly structured data, we develop a framework to extend existing sequence encoders with a graph component that can reason about long-distance relationships in weakly structured data such as text. In an extensive evaluation, we show that the resulting hybrid sequence-graph models outperform both pure sequence models as well as pure graph models on a range of summarization tasks.
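The hybrid sequence-graph idea above can be sketched in a few lines. This is a minimal illustration, not the paper's actual model: the function names, the toy tanh-RNN, the single mean-aggregation message-passing round, and the additive combination are all our assumptions, chosen only to show how per-token sequence states can be refined by a graph over tokens with long-distance edges.

```python
import numpy as np

# Hypothetical sketch of a hybrid sequence-graph encoder: a sequence
# encoder produces per-token states, then one round of message passing
# over a token graph lets states flow along long-distance edges.
# All names and dimensions here are illustrative assumptions.

rng = np.random.default_rng(0)

def sequence_encode(X, Wh, Wx):
    """Toy unidirectional tanh-RNN over token features X of shape (T, d_in)."""
    T, d_h = X.shape[0], Wh.shape[0]
    h = np.zeros(d_h)
    H = np.zeros((T, d_h))
    for t in range(T):
        h = np.tanh(Wh @ h + Wx @ X[t])
        H[t] = h
    return H

def graph_step(H, A, Wg):
    """One message-passing round: average neighbour states, then transform."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)  # avoid divide-by-zero
    M = (A @ H) / deg  # mean over neighbours given adjacency matrix A
    return np.tanh(M @ Wg)

def hybrid_encode(X, A, Wh, Wx, Wg):
    """Sequence states plus a graph-derived correction, combined additively."""
    H = sequence_encode(X, Wh, Wx)
    return H + graph_step(H, A, Wg)

# Toy graph: 5 tokens, chain edges plus one long-distance edge (0 <-> 4),
# the kind of relationship a pure sequence encoder handles poorly.
T, d_in, d_h = 5, 8, 16
X = rng.normal(size=(T, d_in))
A = np.zeros((T, T))
for i in range(T - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
A[0, 4] = A[4, 0] = 1.0
Wh = rng.normal(scale=0.1, size=(d_h, d_h))
Wx = rng.normal(scale=0.1, size=(d_h, d_in))
Wg = rng.normal(scale=0.1, size=(d_h, d_h))

states = hybrid_encode(X, A, Wh, Wx, Wg)
print(states.shape)  # (5, 16): one refined state per token
```

In the paper the sequence component is a trained neural encoder and the graph component a graph neural network over rich structural edges; the additive combination here is just one simple way to merge the two views.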
We presented a framework for extending sequence encoders with a graph component that can leverage rich additional structure. In an evaluation on three different summarization tasks, we have shown that this augmentation improves the performance of a range of different sequence models across all