Short conference paper
Appeared in: Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2023)
Abstract:
Learned sparse retrieval (LSR) is a family of neural retrieval methods that transform queries and documents into sparse weight vectors aligned with a vocabulary. While LSR approaches like Splade work well for short passages, it is unclear how well they handle longer documents. We investigate existing aggregation approaches for adapting LSR to longer documents and find that proximal scoring is crucial for LSR to handle long documents. To leverage this property, we propose two adaptations of the Sequential Dependence Model (SDM) to LSR: ExactSDM and SoftSDM. ExactSDM assumes only exact query term dependence, while SoftSDM uses potential functions that model the dependence of query terms and their expansion terms (identified using a transformer's masked language modeling head). Our experiments on the MSMARCO Document and TREC Robust04 datasets demonstrate that both ExactSDM and SoftSDM outperform existing LSR aggregation approaches across different document length constraints. Surprisingly, SoftSDM does not provide any performance benefit over ExactSDM. This suggests that soft proximity matching is not necessary for modeling term dependence in LSR, contrary to prior work that found soft unigram matching to improve performance. Overall, this study provides insights into adapting LSR to handle longer documents and proposes effective adaptations that improve its performance. We will release our code and experimental setup upon publication for reproducing our results.
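To make the SDM idea concrete: the classic Sequential Dependence Model scores a document as a weighted combination of unigram matches, ordered (adjacent) bigram matches, and unordered co-occurrences within a window. The sketch below is an illustrative, simplified version of that standard combination over raw token counts; it is not the paper's ExactSDM/SoftSDM implementation, and the function name, default weights, and window size are assumptions for demonstration.

```python
from collections import Counter

def sdm_score(query_terms, doc_tokens, lam=(0.85, 0.10, 0.05), window=8):
    """Illustrative SDM-style score: weighted sum of unigram, ordered-bigram,
    and unordered-window match counts. Weights/window are assumed defaults."""
    counts = Counter(doc_tokens)
    # Unigram component: term frequency of each query term in the document.
    unigram = sum(counts[t] for t in query_terms)
    # Ordered component: adjacent query-term pairs that also appear
    # adjacently (in order) in the document.
    doc_bigrams = list(zip(doc_tokens, doc_tokens[1:]))
    ordered = sum(doc_bigrams.count(pair)
                  for pair in zip(query_terms, query_terms[1:]))
    # Unordered component: for each adjacent query-term pair, count positions
    # of the first term whose surrounding window also contains the second.
    unordered = 0
    for a, b in zip(query_terms, query_terms[1:]):
        for i, tok in enumerate(doc_tokens):
            if tok == a and b in doc_tokens[max(0, i - window + 1): i + window]:
                unordered += 1
    lt, lo, lu = lam
    return lt * unigram + lo * ordered + lu * unordered
```

ExactSDM restricts these dependence features to exact query terms, while SoftSDM additionally allows expansion terms (from a masked language modeling head) to satisfy the proximity matches.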
BibTeX
@inproceedings{nguyen:sigir2023-llsr,
  author    = {Nguyen, Thống and MacAvaney, Sean and Yates, Andrew},
  title     = {Adapting Learned Sparse Retrieval for Long Documents},
  booktitle = {Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval},
  year      = {2023},
  url       = {https://arxiv.org/abs/2305.18494},
  doi       = {10.1145/3539618.3591943}
}