QuanTemp: A real-world open-domain benchmark for fact-checking numerical claims

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer review

Authors

  • V. Venktesh
  • Abhijit Anand
  • Avishek Anand
  • Vinay Setty

External Research Organisations

  • Delft University of Technology
  • University of Stavanger

Details

Original language: English
Title of host publication: Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval
Pages: 650-660
Number of pages: 11
ISBN (electronic): 9798400704314
Publication status: Published - 11 Jul 2024
Event: 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2024 - Washington, United States
Duration: 14 Jul 2024 - 18 Jul 2024

Abstract

With the growth of misinformation on the web, automated fact-checking has garnered immense interest for detecting misinformation and disinformation. Current systems have made significant advancements in handling synthetic claims sourced from Wikipedia, and noteworthy progress has also been achieved in addressing real-world claims verified by fact-checking organizations. We compile and release QuanTemp, a diverse, multi-domain dataset focused exclusively on numerical claims, encompassing comparative, statistical, interval, and temporal aspects, with detailed metadata and an accompanying evidence collection. This addresses the challenge of verifying real-world numerical claims, which are complex and often lack precise information, a gap not filled by existing works that mainly focus on synthetic claims. We evaluate and quantify these gaps in existing solutions for the task of verifying numerical claims. We also evaluate claim-decomposition-based methods and numerical-understanding-based natural language inference (NLI) models; our best baseline achieves a macro-F1 of 58.32. This demonstrates that QuanTemp serves as a challenging evaluation set for numerical claim verification.
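The abstract mentions NLI-based verification baselines and a best macro-F1 of 58.32. As a rough, non-authoritative sketch of what such an evaluation looks like, the Python snippet below scores a numerical claim against a piece of evidence with a generic off-the-shelf NLI model and then computes macro-F1 over a toy set of verdicts; the model choice (facebook/bart-large-mnli), the three-way label scheme, and all example texts are assumptions made for illustration and are not taken from the paper or the released dataset.

# Hedged sketch: NLI-style claim scoring plus macro-F1 over verdicts.
# Model name, label scheme, and example texts are illustrative assumptions;
# this does not reproduce the paper's baselines or data.
from sklearn.metrics import f1_score
from transformers import pipeline

# 1) Score a claim against retrieved evidence: the evidence acts as the NLI
#    premise, and hypothesis_template="{}" passes the claim through unchanged
#    as the hypothesis.
nli = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
evidence = "The report states unemployment fell from 6.2% to 5.4% over the year."
claim = "Unemployment dropped by almost one percentage point."
scored = nli(evidence, candidate_labels=[claim], hypothesis_template="{}")
print("entailment-style score for the claim:", round(scored["scores"][0], 3))

# 2) Macro-F1 over verdict labels: macro averaging weights every class
#    equally, so rare verdict classes count as much as frequent ones.
LABELS = ["True", "False", "Conflicting"]                 # assumed verdict classes
gold = ["True", "False", "Conflicting", "False", "True"]  # hypothetical gold verdicts
pred = ["True", "False", "False", "False", "Conflicting"] # hypothetical system output
print("macro-F1:", round(f1_score(gold, pred, labels=LABELS, average="macro"), 2))

Macro averaging makes a figure like 58.32 sensitive to performance on every verdict class rather than only the most frequent one, which is a common choice when verdict labels are imbalanced.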

Keywords

    claim decomposition, fact-checking, numerical claims

Cite this

QuanTemp: A real-world open-domain benchmark for fact-checking numerical claims. / Venktesh, V.; Anand, Abhijit; Anand, Avishek et al.
Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval. 2024. p. 650-660.

Venktesh, V, Anand, A, Anand, A & Setty, V 2024, QuanTemp: A real-world open-domain benchmark for fact-checking numerical claims. in Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval. pp. 650-660, 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2024, Washington, United States, 14 Jul 2024. https://doi.org/10.48550/arXiv.2403.1716, https://doi.org/10.1145/3626772.3657874
Venktesh, V., Anand, A., Anand, A., & Setty, V. (2024). QuanTemp: A real-world open-domain benchmark for fact-checking numerical claims. In Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 650-660) https://doi.org/10.48550/arXiv.2403.1716, https://doi.org/10.1145/3626772.3657874
Venktesh V, Anand A, Anand A, Setty V. QuanTemp: A real-world open-domain benchmark for fact-checking numerical claims. In Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval. 2024. p. 650-660 doi: 10.48550/arXiv.2403.1716, 10.1145/3626772.3657874
Venktesh, V.; Anand, Abhijit; Anand, Avishek et al. / QuanTemp: A real-world open-domain benchmark for fact-checking numerical claims. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval. 2024. pp. 650-660
BibTeX
@inproceedings{2bf0a3ada17543108d76a4f672da8682,
title = "QuanTemp: A real-world open-domain benchmark for fact-checking numerical claims",
abstract = "With the growth of misinformation on the web, automated fact-checking has garnered immense interest for detecting misinformation and disinformation. Current systems have made significant advancements in handling synthetic claims sourced from Wikipedia, and noteworthy progress has also been achieved in addressing real-world claims verified by fact-checking organizations. We compile and release QuanTemp, a diverse, multi-domain dataset focused exclusively on numerical claims, encompassing comparative, statistical, interval, and temporal aspects, with detailed metadata and an accompanying evidence collection. This addresses the challenge of verifying real-world numerical claims, which are complex and often lack precise information, a gap not filled by existing works that mainly focus on synthetic claims. We evaluate and quantify these gaps in existing solutions for the task of verifying numerical claims. We also evaluate claim-decomposition-based methods and numerical-understanding-based natural language inference (NLI) models; our best baseline achieves a macro-F1 of 58.32. This demonstrates that QuanTemp serves as a challenging evaluation set for numerical claim verification.",
keywords = "claim decomposition, fact-checking, numerical claims",
author = "V. Venktesh and Abhijit Anand and Avishek Anand and Vinay Setty",
note = "Publisher Copyright: {\textcopyright} 2024 Owner/Author.; 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2024 ; Conference date: 14-07-2024 Through 18-07-2024",
year = "2024",
month = jul,
day = "11",
doi = "10.48550/arXiv.2403.1716",
language = "English",
pages = "650--660",
booktitle = "Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval",

}

RIS

TY - GEN

T1 - QuanTemp

T2 - 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2024

AU - Venktesh, V.

AU - Anand, Abhijit

AU - Anand, Avishek

AU - Setty, Vinay

N1 - Publisher Copyright: © 2024 Owner/Author.

PY - 2024/7/11

Y1 - 2024/7/11

N2 - With the growth of misinformation on the web, automated fact-checking has garnered immense interest for detecting misinformation and disinformation. Current systems have made significant advancements in handling synthetic claims sourced from Wikipedia, and noteworthy progress has also been achieved in addressing real-world claims verified by fact-checking organizations. We compile and release QuanTemp, a diverse, multi-domain dataset focused exclusively on numerical claims, encompassing comparative, statistical, interval, and temporal aspects, with detailed metadata and an accompanying evidence collection. This addresses the challenge of verifying real-world numerical claims, which are complex and often lack precise information, a gap not filled by existing works that mainly focus on synthetic claims. We evaluate and quantify these gaps in existing solutions for the task of verifying numerical claims. We also evaluate claim-decomposition-based methods and numerical-understanding-based natural language inference (NLI) models; our best baseline achieves a macro-F1 of 58.32. This demonstrates that QuanTemp serves as a challenging evaluation set for numerical claim verification.

AB - With the growth of misinformation on the web, automated fact-checking has garnered immense interest for detecting misinformation and disinformation. Current systems have made significant advancements in handling synthetic claims sourced from Wikipedia, and noteworthy progress has also been achieved in addressing real-world claims verified by fact-checking organizations. We compile and release QuanTemp, a diverse, multi-domain dataset focused exclusively on numerical claims, encompassing comparative, statistical, interval, and temporal aspects, with detailed metadata and an accompanying evidence collection. This addresses the challenge of verifying real-world numerical claims, which are complex and often lack precise information, a gap not filled by existing works that mainly focus on synthetic claims. We evaluate and quantify these gaps in existing solutions for the task of verifying numerical claims. We also evaluate claim-decomposition-based methods and numerical-understanding-based natural language inference (NLI) models; our best baseline achieves a macro-F1 of 58.32. This demonstrates that QuanTemp serves as a challenging evaluation set for numerical claim verification.

KW - claim decomposition

KW - fact-checking

KW - numerical claims

UR - http://www.scopus.com/inward/record.url?scp=85200538956&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2403.1716

DO - 10.48550/arXiv.2403.1716

M3 - Conference contribution

AN - SCOPUS:85200538956

SP - 650

EP - 660

BT - Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval

Y2 - 14 July 2024 through 18 July 2024

ER -