https://arxiv.org/pdf/1711.08726.pdf Modelling Domain Relationships for Transfer Learning on Retrieval-based Question Answering Systems in E-commerce

https://arxiv.org/pdf/1705.01509.pdf Neural Models for Information Retrieval

https://arxiv.org/abs/1807.02299 On the Equilibrium of Query Reformulation and Document Retrieval

https://github.com/hamed-zamani/snrm Standalone Neural Ranking Model (SNRM)

https://github.com/williamleif/graphqembed Embedding Logical Queries on Knowledge Graphs https://arxiv.org/pdf/1806.01445.pdf

Here we introduce a framework to efficiently make predictions about conjunctive logical queries, a flexible but tractable subset of first-order logic, on incomplete knowledge graphs. In our approach, we embed graph nodes in a low-dimensional space and represent logical operators as learned geometric operations (e.g., translation, rotation) in this embedding space. By performing logical operations within a low-dimensional embedding space, our approach achieves a time complexity that is linear in the number of query variables, compared to the exponential complexity required by a naive enumeration-based approach. We demonstrate the utility of this framework in two application studies on real-world datasets with millions of relations: predicting logical relationships in a network of drug-gene-disease interactions and in a graph-based representation of social interactions derived from a popular web forum.
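
A minimal sketch of the idea: relation projection modeled as a translation (TransE-style) and conjunction as an element-wise mean. The paper learns its geometric operators end to end; the entity and relation names and the toy queries below are hypothetical stand-ins.

<code python>
# Sketch only: answering conjunctive queries by composing geometric
# operators in an embedding space. Projection along a relation is a
# translation, conjunction (AND) an element-wise mean; the paper instead
# learns these operators, and all names/data here are made up.
import numpy as np

rng = np.random.default_rng(0)
DIM = 16

# Toy embedding tables for a small drug-gene-disease graph.
entities = {name: rng.normal(size=DIM) for name in
            ["drug_A", "gene_X", "gene_Y", "disease_Z"]}
relations = {name: rng.normal(size=DIM) for name in
             ["targets", "associated_with"]}

def project(query_emb, relation):
    """Follow one relation edge: geometric operator = translation."""
    return query_emb + relations[relation]

def intersect(*branch_embs):
    """Conjunction of query branches, here an element-wise mean."""
    return np.mean(branch_embs, axis=0)

def nearest_entities(query_emb, k=2):
    """Rank candidate answers by cosine similarity to the query embedding."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return sorted(entities, key=lambda e: -cos(entities[e], query_emb))[:k]

# Path query: "diseases associated with a gene that drug_A targets".
# Each operator application is O(1) in the embedding space, so cost grows
# linearly with the number of query variables rather than exponentially
# as with naive enumeration over the graph.
q = project(project(entities["drug_A"], "targets"), "associated_with")
print(nearest_entities(q))

# Conjunctive query: "diseases associated with both gene_X and gene_Y".
q = intersect(project(entities["gene_X"], "associated_with"),
              project(entities["gene_Y"], "associated_with"))
print(nearest_entities(q))
</code>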

https://arxiv.org/abs/1809.05679v1 Graph Convolutional Networks for Text Classification

We build a single text graph for a corpus based on word co-occurrence and document-word relations, then learn a Text Graph Convolutional Network (Text GCN) for the corpus. Our Text GCN is initialized with one-hot representations for words and documents; it then jointly learns the embeddings for both words and documents, supervised by the known class labels for documents.
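
As a rough sketch of what that pipeline computes, here is an untrained two-layer GCN forward pass over a tiny mixed word/document graph; the graph, edge weights, and dimensions are illustrative assumptions standing in for the TF-IDF and PMI construction the paper describes.

<code python>
# Minimal sketch of the Text GCN forward pass on a toy graph (untrained).
# In the paper, document-word edges carry TF-IDF weights and word-word
# edges carry PMI co-occurrence weights; the tiny graph and random weight
# matrices below are illustrative, not the authors' code.
import numpy as np

rng = np.random.default_rng(0)

# 3 document nodes (0-2) + 4 word nodes (3-6) in a single graph.
N_DOCS, N_CLASSES = 3, 2
n = 7
A = np.zeros((n, n))
A[0, 3] = A[0, 4] = 1.0   # doc-word edges (TF-IDF weights in the paper)
A[1, 4] = A[1, 5] = 1.0
A[2, 5] = A[2, 6] = 1.0
A[4, 5] = 0.5             # word-word co-occurrence edge (PMI in the paper)
A = A + A.T               # undirected graph

# Symmetric normalization with self-loops: A_hat = D^-1/2 (A + I) D^-1/2.
A_loop = A + np.eye(n)
d_inv_sqrt = 1.0 / np.sqrt(A_loop.sum(axis=1))
A_hat = A_loop * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

# One-hot node features: X is the identity, so the first layer's weight
# matrix doubles as the learned embedding table for words and documents.
X = np.eye(n)
W0 = rng.normal(scale=0.1, size=(n, 8))          # layer 1: n -> hidden
W1 = rng.normal(scale=0.1, size=(8, N_CLASSES))  # layer 2: hidden -> classes

H = np.maximum(A_hat @ X @ W0, 0.0)              # ReLU(A_hat X W0)
logits = A_hat @ H @ W1                          # per-node class scores
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Training applies cross-entropy only to the labeled document rows (0-2);
# word nodes stay unlabeled but still receive embeddings via W0.
print(probs[:N_DOCS])
</code>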

https://arxiv.org/abs/1803.01707v2 Neural Architectures for Open-Type Relation Argument Extraction

https://arxiv.org/abs/1810.09591 Applying Deep Learning To Airbnb Search

The application to search ranking is one of the biggest machine learning success stories at Airbnb. Much of the initial gains were driven by a gradient boosted decision tree model. The gains, however, plateaued over time. This paper discusses the work done in applying neural networks in an attempt to break out of that plateau. We present our perspective not with the intention of pushing the frontier of new modeling techniques. Instead, ours is a story of the elements we found useful in applying neural networks to a real life product. Deep learning was steep learning for us. To other teams embarking on similar journeys, we hope an account of our struggles and triumphs will provide some useful pointers. Bon voyage!