Details
Original language | English |
---|---|
Title of host publication | CIKM 2020 - Proceedings of the 29th ACM International Conference on Information and Knowledge Management |
Publisher | Association for Computing Machinery (ACM) |
Pages | 2197-2200 |
Number of pages | 4 |
ISBN (electronic) | 9781450368599 |
Publication status | Published - Oct 2020 |
Event | 29th ACM International Conference on Information and Knowledge Management - Virtual, Online, Ireland. Duration: 19 Oct 2020 → 23 Oct 2020 |
Abstract
Recently introduced pre-trained contextualized models like BERT have shown improvements in document retrieval tasks. A major limitation of current approaches is how they handle variable-length documents with a fixed-input BERT model: common approaches either truncate longer documents or split them into short sentences/passages, which are then labelled either with the original document label or by an externally trained model. A second problem is the scarcity of labelled query-document pairs, which directly hampers the performance of modern data-hungry neural models and becomes even more complicated with large, partially labelled query datasets derived from query logs (TREC-DL). In this paper, we address both issues simultaneously and introduce passage-level weak supervision in contrast to standard document-level supervision. We conduct a preliminary study of document-to-passage label transfer and of the influence of unlabelled documents on ad hoc document retrieval performance. We observe that directly transferring relevance labels from documents to passages introduces label noise that strongly degrades retrieval effectiveness. We propose a weak-supervision-based passage labelling scheme that improves performance and gathers relevant passages from unlabelled documents.
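To make the label-transfer step concrete, here is a minimal illustrative sketch in Python. It is not the authors' implementation, and every name, passage size, and stride in it is hypothetical; it only shows the naive baseline the abstract criticizes: a fixed-input BERT ranker cannot read a long document, so the document is split into passages and each passage simply inherits the document's relevance label.

```python
# Illustrative sketch only - not the paper's code. Shows why naive
# document-to-passage label transfer is noisy for fixed-input BERT rankers.
from typing import List, Tuple

MAX_PASSAGE_TOKENS = 150  # hypothetical passage size fitting BERT's input limit
STRIDE = 75               # hypothetical overlap between consecutive passages

def split_into_passages(doc_tokens: List[str]) -> List[List[str]]:
    """Split a long document into overlapping fixed-size passages, since a
    fixed-input BERT model cannot consume the whole document at once."""
    passages = []
    start = 0
    while True:
        passages.append(doc_tokens[start:start + MAX_PASSAGE_TOKENS])
        if start + MAX_PASSAGE_TOKENS >= len(doc_tokens):
            break
        start += STRIDE
    return passages

def naive_label_transfer(doc_tokens: List[str],
                         doc_relevance: int) -> List[Tuple[List[str], int]]:
    """Copy the document-level relevance label onto every passage.
    This is the noisy baseline: a relevant document usually contains
    many passages that are not relevant to the query on their own."""
    return [(passage, doc_relevance)
            for passage in split_into_passages(doc_tokens)]
```

The paper's proposed alternative replaces this blanket copy with a weak-supervision signal that judges passages individually, so that only passages the weak labeller considers relevant inherit a positive label; the sketch above shows only the baseline being improved upon.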
ASJC Scopus subject areas
- Business, Management and Accounting (all)
- General Business, Management and Accounting
- Decision Sciences (all)
- General Decision Sciences
Cite
Rudra, K., & Anand, A. (2020). Distant Supervision in BERT-based Adhoc Document Retrieval. In CIKM 2020 - Proceedings of the 29th ACM International Conference on Information and Knowledge Management (pp. 2197-2200). Association for Computing Machinery (ACM). https://doi.org/10.1145/3340531.3412124
Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › Peer-reviewed
TY - GEN
T1 - Distant Supervision in BERT-based Adhoc Document Retrieval
AU - Rudra, Koustav
AU - Anand, Avishek
N1 - Funding information: This project was funded in part by the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 832921.
PY - 2020/10
Y1 - 2020/10
N2 - Recently introduced pre-trained contextualized models like BERT have shown improvements in document retrieval tasks. A major limitation of current approaches is how they handle variable-length documents with a fixed-input BERT model: common approaches either truncate longer documents or split them into short sentences/passages, which are then labelled either with the original document label or by an externally trained model. A second problem is the scarcity of labelled query-document pairs, which directly hampers the performance of modern data-hungry neural models and becomes even more complicated with large, partially labelled query datasets derived from query logs (TREC-DL). In this paper, we address both issues simultaneously and introduce passage-level weak supervision in contrast to standard document-level supervision. We conduct a preliminary study of document-to-passage label transfer and of the influence of unlabelled documents on ad hoc document retrieval performance. We observe that directly transferring relevance labels from documents to passages introduces label noise that strongly degrades retrieval effectiveness. We propose a weak-supervision-based passage labelling scheme that improves performance and gathers relevant passages from unlabelled documents.
AB - Recently introduced pre-trained contextualized models like BERT have shown improvements in document retrieval tasks. A major limitation of current approaches is how they handle variable-length documents with a fixed-input BERT model: common approaches either truncate longer documents or split them into short sentences/passages, which are then labelled either with the original document label or by an externally trained model. A second problem is the scarcity of labelled query-document pairs, which directly hampers the performance of modern data-hungry neural models and becomes even more complicated with large, partially labelled query datasets derived from query logs (TREC-DL). In this paper, we address both issues simultaneously and introduce passage-level weak supervision in contrast to standard document-level supervision. We conduct a preliminary study of document-to-passage label transfer and of the influence of unlabelled documents on ad hoc document retrieval performance. We observe that directly transferring relevance labels from documents to passages introduces label noise that strongly degrades retrieval effectiveness. We propose a weak-supervision-based passage labelling scheme that improves performance and gathers relevant passages from unlabelled documents.
KW - adhoc retrieval
KW - distant supervision
KW - document ranking
UR - http://www.scopus.com/inward/record.url?scp=85095866363&partnerID=8YFLogxK
U2 - 10.1145/3340531.3412124
DO - 10.1145/3340531.3412124
M3 - Conference contribution
AN - SCOPUS:85095866363
SP - 2197
EP - 2200
BT - CIKM 2020 - Proceedings of the 29th ACM International Conference on Information and Knowledge Management
PB - Association for Computing Machinery (ACM)
T2 - 29th ACM International Conference on Information and Knowledge Management, CIKM 2020
Y2 - 19 October 2020 through 23 October 2020
ER -