CheckThat! Lab at CLEF 2025


Task 3: Fact-Checking Numerical Claims

Definition

This task focuses on verifying claims that involve numerical quantities and temporal expressions. Numerical claims are defined as those requiring validation of explicit or implicit quantitative or temporal details. Participants must classify each claim as True, False, or Conflicting based on a short list of evidence. The fact-verification task is offered in English, Spanish, and Arabic.

Each claim is provided with its top-k BM25 evidence, and participants can either select carefully from this evidence set or apply re-ranking approaches to improve fact-verification performance. The evidence corpus is built by pooling the results of multiple claim decomposition approaches; the top-100 BM25 evidence documents are then retrieved from this pool to provide the diverse perspectives required for claim verification.
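As one option for the re-ranking step mentioned above, participants could re-score the provided evidence against the claim. The sketch below implements plain BM25 scoring in pure Python; the parameter values (k1=1.5, b=0.75) and the toy claim/evidence strings are illustrative assumptions, not part of the task data.

```python
import math
from collections import Counter

def bm25_scores(query, docs, k1=1.5, b=0.75):
    """Score each tokenized document in `docs` against the tokenized `query`
    using the standard BM25 formula (parameters are common defaults)."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    # document frequency of each term across the evidence pool
    df = Counter()
    for d in docs:
        for term in set(d):
            df[term] += 1
    scores = []
    for d in docs:
        tf = Counter(d)
        score = 0.0
        for term in query:
            if term not in tf:
                continue
            idf = math.log(1 + (N - df[term] + 0.5) / (df[term] + 0.5))
            score += idf * tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(score)
    return scores

# Hypothetical claim and evidence snippets, whitespace-tokenized for brevity.
claim = "unemployment fell by 3 percent in 2020".split()
evidence = [
    "unemployment fell sharply by 3 percent during 2020".split(),
    "the population grew steadily over the decade".split(),
]
scores = bm25_scores(claim, evidence)
# Rank evidence indices by descending BM25 score.
ranking = sorted(range(len(evidence)), key=lambda i: scores[i], reverse=True)
```

In practice one would tokenize more carefully (lowercasing, number normalization) and could replace this lexical scorer with a neural cross-encoder for re-ranking.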

Datasets

The dataset is collected from various fact-checking domains through the Google Fact Check Explorer API, complete with detailed metadata and an evidence corpus sourced from the web. Our pipeline filters the collection to retain only numerical claims. An overview of the dataset statistics is shown in the table below.

Language   Count
English    15,514
Spanish    2,082
Arabic     2,200

Evaluation

We use the macro-averaged F1 score, together with classwise F1 scores, to evaluate fact-verification performance.
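The metrics above can be computed with standard libraries (e.g. scikit-learn's `f1_score`); a minimal self-contained sketch of the classwise and macro-averaged F1 over the three task labels, using made-up gold and predicted labels, looks like this:

```python
def classwise_f1(gold, pred, labels):
    """Return per-class F1 scores for a list of gold and predicted labels."""
    f1s = {}
    for label in labels:
        tp = sum(1 for g, p in zip(gold, pred) if g == label and p == label)
        fp = sum(1 for g, p in zip(gold, pred) if g != label and p == label)
        fn = sum(1 for g, p in zip(gold, pred) if g == label and p != label)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s[label] = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return f1s

LABELS = ["True", "False", "Conflicting"]

# Toy example: four claims with gold labels and system predictions.
gold = ["True", "False", "Conflicting", "True"]
pred = ["True", "False", "False", "True"]

per_class = classwise_f1(gold, pred, LABELS)
# Macro-average: unweighted mean of the per-class F1 scores.
macro_f1 = sum(per_class.values()) / len(LABELS)
```

The macro average weights all three classes equally, so performance on the rarer Conflicting class counts as much as on True and False.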

Submission

All scripts can be found in the CheckThat! Lab Task 3 repository on GitLab.

Leaderboard

TBA

Organizers

  • Vinay Setty, University of Stavanger, Norway

Contact

For queries, please join the Slack channel.

Alternatively, please send an email to: clef-factcheck@googlegroups.com