From 176141a7cb4d47be6781cccaa73508cdc71a0961 Mon Sep 17 00:00:00 2001 From: sharmila upadhyaya Date: Wed, 19 Feb 2025 13:26:50 +0100 Subject: [PATCH] Update somd25.html --- 2025/somd25.html | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/2025/somd25.html b/2025/somd25.html index 19ec82c..f529bff 100644 --- a/2025/somd25.html +++ b/2025/somd25.html @@ -76,7 +76,7 @@

Dataset

We will upload the dataset shortly after the competition begins.

Evaluation

-

We evaluate submissions using the F1 score, the harmonic mean of precision and recall, for both Named Entity Recognition (NER) and Relation Extraction (RE). We will calculate the macro-averaged F1 score using exact-match criteria for each of the two test phases.

+

We evaluate submissions using the F1 score, the harmonic mean of precision and recall, for both Named Entity Recognition (NER) and Relation Extraction (RE). We will calculate the macro-averaged F1 score using exact-match (Nakayama, 2018) criteria for each of the two test phases.

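The macro-averaged, exact-match F1 described above can be sketched as follows. This is an illustrative reimplementation of the criterion, not the official scorer (the Nakayama, 2018 citation suggests the seqeval library is used in practice); the entity tuples and type names below are hypothetical.

```python
def macro_f1(gold, pred):
    """Macro-averaged F1 over entity types with an exact-match criterion:
    a predicted entity counts as correct only if its type and span match
    a gold entity exactly. Entities are (type, start, end) tuples."""
    types = {t for t, _, _ in gold} | {t for t, _, _ in pred}
    scores = []
    for t in sorted(types):
        g = {e for e in gold if e[0] == t}
        p = {e for e in pred if e[0] == t}
        tp = len(g & p)
        prec = tp / len(p) if p else 0.0
        rec = tp / len(g) if g else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        scores.append(f1)
    # Macro average: each entity type contributes equally.
    return sum(scores) / len(scores) if scores else 0.0

# Illustrative example: one exact match, one span mismatch.
gold = [("Software", 0, 2), ("Version", 3, 4)]
pred = [("Software", 0, 2), ("Version", 3, 5)]
print(round(macro_f1(gold, pred), 2))  # → 0.5
```

Note that under exact match, the partially overlapping "Version" span above earns no credit; only predictions identical in both type and boundaries count toward the score.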
Competition Timeline Overview