Details
| Original language | English |
|---|---|
| Title of host publication | CIKM 2020 - Proceedings of the 29th ACM International Conference on Information and Knowledge Management |
| Publisher | Association for Computing Machinery (ACM) |
| Pages | 2197-2200 |
| Number of pages | 4 |
| ISBN (electronic) | 9781450368599 |
| Publication status | Published - Oct 2020 |
| Event | 29th ACM International Conference on Information and Knowledge Management, CIKM 2020 - Virtual, Online, Ireland. Duration: 19 Oct 2020 → 23 Oct 2020 |
Abstract
Recently introduced pre-trained contextualized language models like BERT have shown improvements in document retrieval tasks. One of the major limitations of current approaches is the manner in which they handle variable-length documents with a fixed-input BERT model. Common approaches either truncate longer documents or split them into short sentences/passages, which are then labelled using either the original document label or an externally trained model. A second problem is the scarcity of labelled query-document pairs, which directly hampers the performance of modern data-hungry neural models. This is further complicated by large, partially labelled query datasets derived from query logs (TREC-DL). In this paper, we address both issues simultaneously and introduce passage-level weak supervision in contrast to standard document-level supervision. We conduct a preliminary study of document-to-passage label transfer and of the influence of unlabelled documents on the performance of adhoc document retrieval. We observe that directly transferring relevance labels from documents to passages introduces label noise that strongly affects retrieval effectiveness. We propose a weak-supervision-based transfer passage labelling scheme that improves performance and gathers relevant passages from unlabelled documents.
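The label-transfer idea described above can be sketched in a few lines. This is a minimal illustrative toy, not the paper's actual pipeline: all function names are hypothetical, the passage splitter is a plain word-window, and the "weak supervision" is represented by an arbitrary external passage scorer rather than the learned model the authors use.

```python
def split_into_passages(document: str, passage_len: int = 50) -> list[str]:
    """Split a document into fixed-size word windows (passages)."""
    words = document.split()
    return [" ".join(words[i:i + passage_len])
            for i in range(0, len(words), passage_len)]

def direct_label_transfer(document: str, doc_label: int) -> list[tuple[str, int]]:
    """Naive transfer: every passage inherits the document's relevance label.
    As the abstract notes, this introduces label noise, since not every
    passage of a relevant document is itself relevant."""
    return [(p, doc_label) for p in split_into_passages(document)]

def weak_label_transfer(document: str, doc_label: int,
                        score, threshold: float = 0.5) -> list[tuple[str, int]]:
    """Weakly supervised transfer: a passage keeps the document label only
    when an external scorer deems it relevant enough; otherwise it is
    treated as non-relevant."""
    return [(p, doc_label if score(p) >= threshold else 0)
            for p in split_into_passages(document)]
```

In this sketch, `score` stands in for whatever external relevance signal is available (e.g. a lexical matcher or a separately trained model); the contrast between `direct_label_transfer` and `weak_label_transfer` mirrors the document-level vs. passage-level supervision distinction studied in the paper.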
Keywords
- adhoc retrieval, distant supervision, document ranking
ASJC Scopus subject areas
- General Business, Management and Accounting
- General Decision Sciences
Cite this
CIKM 2020 - Proceedings of the 29th ACM International Conference on Information and Knowledge Management. Association for Computing Machinery (ACM), 2020. p. 2197-2200.
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review
TY - GEN
T1 - Distant Supervision in BERT-based Adhoc Document Retrieval
AU - Rudra, Koustav
AU - Anand, Avishek
N1 - Funding information: Acknowledgement: Funding for this project was in part provided by the European Union’s Horizon 2020 research and innovation programme under grant agreement No 832921.
PY - 2020/10
Y1 - 2020/10
AB - Recently introduced pre-trained contextualized language models like BERT have shown improvements in document retrieval tasks. One of the major limitations of current approaches is the manner in which they handle variable-length documents with a fixed-input BERT model. Common approaches either truncate longer documents or split them into short sentences/passages, which are then labelled using either the original document label or an externally trained model. A second problem is the scarcity of labelled query-document pairs, which directly hampers the performance of modern data-hungry neural models. This is further complicated by large, partially labelled query datasets derived from query logs (TREC-DL). In this paper, we address both issues simultaneously and introduce passage-level weak supervision in contrast to standard document-level supervision. We conduct a preliminary study of document-to-passage label transfer and of the influence of unlabelled documents on the performance of adhoc document retrieval. We observe that directly transferring relevance labels from documents to passages introduces label noise that strongly affects retrieval effectiveness. We propose a weak-supervision-based transfer passage labelling scheme that improves performance and gathers relevant passages from unlabelled documents.
KW - adhoc retrieval
KW - distant supervision
KW - document ranking
UR - http://www.scopus.com/inward/record.url?scp=85095866363&partnerID=8YFLogxK
U2 - 10.1145/3340531.3412124
DO - 10.1145/3340531.3412124
M3 - Conference contribution
AN - SCOPUS:85095866363
SP - 2197
EP - 2200
BT - CIKM 2020 - Proceedings of the 29th ACM International Conference on Information and Knowledge Management
PB - Association for Computing Machinery (ACM)
T2 - 29th ACM International Conference on Information and Knowledge Management, CIKM 2020
Y2 - 19 October 2020 through 23 October 2020
ER -