
Code Repository for "Efficient Cost-Aware Cascade Ranking in Multi-Stage Retrieval"

RMIT University, Australia
Professor Shane Culpepper (Aggregated by)

Licence & Rights:

CC BY-NC: Attribution-Noncommercial 3.0 AU
http://creativecommons.org/licenses/by-nc/3.0/au

All rights reserved

Access:

Data available via the linked repository. For any queries about this or any other RMIT dataset, please contact research.data@rmit.edu.au.

Contact Information

GitHub

Full description

This repository is home to a reference implementation of the cascade ranking model in the SIGIR '17 paper "Efficient Cost-Aware Cascade Ranking for Multi-Stage Retrieval". Complex machine learning models are now an integral part of modern, large-scale retrieval systems. However, collection size growth continues to outpace advances in efficiency improvements in the learning models which achieve the highest effectiveness. In this paper, we re-examine the importance of tightly integrating feature costs into multi-stage learning-to-rank (LTR) IR systems. We present a novel approach to optimizing cascaded ranking models which can directly leverage a variety of different state-of-the-art LTR rankers such as LambdaMART and Gradient Boosted Decision Trees. Using our cascade model, we conclusively show that feature costs and the number of documents being re-ranked in each stage of the cascade can be balanced to maximize both efficiency and effectiveness. Finally, we also demonstrate that our cascade model can easily be deployed on commonly used collections to achieve state-of-the-art effectiveness results while only using a subset of the features required by the full model.
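The description above explains the core idea: a cascade scores documents with progressively more expensive feature sets, pruning the candidate pool at each stage so that costly features are computed only for documents that survive the cheaper stages. A minimal sketch of that control flow is shown below; it is a hypothetical illustration, not the paper's reference implementation, and the stage scorers, cutoffs, and per-document costs are all invented for the example.

```python
# Hypothetical sketch of a cost-aware cascade (not the LTR_Cascade code):
# each stage re-scores the surviving documents with a costlier feature set,
# then keeps only the top-k before the next stage runs.

def cascade_rank(docs, stages):
    """docs: list of (doc_id, feature_dict) pairs.
    stages: list of (scorer, cutoff, cost_per_doc) tuples, cheapest first.
    Returns the final ranked survivors and the total feature cost paid."""
    survivors = list(docs)
    total_cost = 0.0
    for scorer, cutoff, cost_per_doc in stages:
        # Feature extraction is charged for every document still alive,
        # which is why early pruning reduces overall cost.
        total_cost += cost_per_doc * len(survivors)
        survivors = sorted(survivors, key=scorer, reverse=True)[:cutoff]
    return survivors, total_cost

# Toy usage: a cheap first-stage ranker (e.g. a BM25-like score) followed
# by an expensive re-ranker (e.g. an LTR model score).
docs = [("d1", {"bm25": 2.0, "ltr": 0.9}),
        ("d2", {"bm25": 3.0, "ltr": 0.1}),
        ("d3", {"bm25": 1.0, "ltr": 0.8})]
stages = [(lambda d: d[1]["bm25"], 2, 1.0),    # cheap, keeps top 2
          (lambda d: d[1]["ltr"], 1, 10.0)]    # costly, keeps top 1
ranked, cost = cascade_rank(docs, stages)
```

The trade-off the paper optimizes is visible even in this toy: running the expensive stage on 2 survivors instead of all 3 documents saves cost, while the cutoffs control how much effectiveness the pruning can sacrifice.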

This dataset is part of a larger collection

Subjects
  • Learning-to-Rank
  • Cascade Ranking
  • Information Retrieval
  • Information Retrieval and Web Search
  • INFORMATION AND COMPUTING SCIENCES
  • LIBRARY AND INFORMATION STUDIES


Identifiers
  • Local : 971df393dd744e8ac446f9104a61d68a
  • DOI (related publication) : http://dx.doi.org/10.1145/3077136.3080819