source link: https://arxiv.org/abs/2004.12832


[Submitted on 27 Apr 2020 (v1), last revised 4 Jun 2020 (this version, v2)]

ColBERT: Efficient and Effective Passage Search via Contextualized Late Interaction over BERT


Recent progress in Natural Language Understanding (NLU) is driving fast-paced advances in Information Retrieval (IR), largely owed to fine-tuning deep language models (LMs) for document ranking. While remarkably effective, the ranking models based on these LMs increase computational cost by orders of magnitude over prior approaches, particularly as they must feed each query-document pair through a massive neural network to compute a single relevance score. To tackle this, we present ColBERT, a novel ranking model that adapts deep LMs (in particular, BERT) for efficient retrieval. ColBERT introduces a late interaction architecture that independently encodes the query and the document using BERT and then employs a cheap yet powerful interaction step that models their fine-grained similarity. By delaying and yet retaining this fine-granular interaction, ColBERT can leverage the expressiveness of deep LMs while simultaneously gaining the ability to pre-compute document representations offline, considerably speeding up query processing. Beyond reducing the cost of re-ranking the documents retrieved by a traditional model, ColBERT's pruning-friendly interaction mechanism enables leveraging vector-similarity indexes for end-to-end retrieval directly from a large document collection. We extensively evaluate ColBERT using two recent passage search datasets. Results show that ColBERT's effectiveness is competitive with existing BERT-based models (and outperforms every non-BERT baseline), while executing two orders-of-magnitude faster and requiring four orders-of-magnitude fewer FLOPs per query.
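
To make the "cheap yet powerful interaction step" concrete: ColBERT scores a query-document pair with a MaxSim operation, matching each contextualized query token embedding against its most similar document token embedding and summing those maxima. Below is a minimal PyTorch sketch of that scoring, assuming L2-normalized embeddings so that dot products equal cosine similarities; the embedding dimension, the token counts, and the random tensors standing in for BERT outputs are illustrative assumptions, not the authors' exact implementation.

    import torch
    import torch.nn.functional as F

    def late_interaction_score(Q, D):
        # Q: (num_query_tokens, dim) contextualized query embeddings
        # D: (num_doc_tokens, dim) contextualized document embeddings
        # Both assumed L2-normalized, so Q @ D.T gives cosine similarities.
        sim = Q @ D.T                      # (num_query_tokens, num_doc_tokens)
        # MaxSim: best-matching document token per query token, summed.
        return sim.max(dim=1).values.sum()

    # Hypothetical usage with random vectors standing in for BERT outputs.
    Q = F.normalize(torch.randn(32, 128), dim=-1)    # 32 query tokens
    D = F.normalize(torch.randn(180, 128), dim=-1)   # 180 document tokens
    score = late_interaction_score(Q, D)

Because D depends only on the document, these document matrices can be computed once offline and stored in a vector-similarity index; at query time, only the query is encoded. This separation is what lets the late interaction be both precomputed and pruned, yielding the speedups the abstract reports.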

Comments: Accepted at SIGIR 2020
Subjects: Information Retrieval (cs.IR); Computation and Language (cs.CL)
Cite as: arXiv:2004.12832 [cs.IR]
  (or arXiv:2004.12832v2 [cs.IR] for this version)
  https://doi.org/10.48550/arXiv.2004.12832

Submission history

From: Omar Khattab
[v1] Mon, 27 Apr 2020 14:21:03 UTC (1,437 KB)
[v2] Thu, 4 Jun 2020 05:28:21 UTC (3,959 KB)
