
OpenNIR: A Complete Neural Ad-Hoc Ranking Pipeline


Author: Sean MacAvaney

Appeared in: Proceedings of the Thirteenth ACM International Conference on Web Search and Data Mining (WSDM 2020)

Links/IDs:
DOI: 10.1145/3336191.3371864
DBLP: conf/wsdm/MacAvaney20
Google Scholar: 7wWfoDgAAAAJ:0EnyYjriUFMC
Semantic Scholar: 7a323447c702ab3d94b28634ffe5d488f49059dd
smac.pub: wsdm2020-onir

Abstract:

With the growing popularity of neural approaches for ad-hoc ranking, there is a need for tools that can effectively reproduce prior results and ease continued research by supporting current state-of-the-art approaches. Although several excellent neural ranking tools exist, none offer an easy end-to-end ad-hoc neural ranking pipeline. A complete pipeline is particularly important for ad-hoc ranking because there are numerous parameter settings that have a considerable effect on the ultimate performance yet often are under-reported in current work (e.g., initial ranking settings, re-ranking threshold, training sampling strategy, etc.). In this work, I present OpenNIR, a complete ad-hoc neural ranking pipeline, which addresses these shortcomings. The pipeline is easy to use (a single command will download required data, train, and evaluate a model), yet highly configurable, allowing for continued work in areas that are understudied. Aside from the core pipeline, the software also includes several bells and whistles that make use of components of the pipeline, such as performance benchmarking and tuning of unsupervised ranker parameters for fair comparisons against traditional baselines. The pipeline and these capabilities are demonstrated. The code is available, and contributions are welcome.
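For reference, invocation follows the pattern documented in the OpenNIR repository: a single pipeline script takes a model configuration, a dataset configuration, and optional key=value overrides. The specific config names shown below (conv_knrm, antique) are illustrative and may differ from the current repository layout; this is a sketch of the "single command" usage described in the abstract, not an authoritative listing.

    # Sketch: train and evaluate a model with one command (downloads data, trains, evaluates).
    # Config names are illustrative; see the OpenNIR repository for the current layout.
    scripts/pipeline.sh config/conv_knrm config/antique
    # Individual pipeline settings (e.g., re-ranking threshold, sampling strategy) can be
    # overridden with additional key=value arguments on the same command line.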

BibTeX:

@inproceedings{macavaney:wsdm2020-onir,
  author    = {MacAvaney, Sean},
  title     = {OpenNIR: A Complete Neural Ad-Hoc Ranking Pipeline},
  booktitle = {Proceedings of the Thirteenth ACM International Conference on Web Search and Data Mining},
  year      = {2020},
  doi       = {10.1145/3336191.3371864},
  pages     = {845--848}
}