From c992713b6e0b9989590d39d4793b0e5f36b327c4 Mon Sep 17 00:00:00 2001 From: sharmila upadhyaya Date: Wed, 19 Feb 2025 13:24:39 +0100 Subject: [PATCH] Update somd25.html --- 2025/somd25.html | 7 ++++++- 1 file changed, 6 insertions(+), 1 deletion(-) diff --git a/2025/somd25.html b/2025/somd25.html index b8a2d97..19ec82c 100644 --- a/2025/somd25.html +++ b/2025/somd25.html @@ -76,7 +76,7 @@

Dataset

We will upload the dataset shortly after the competition begins.

Evaluation

-

We evaluate submissions using the F1 score, a metric that reflects the accuracy and precision of the Named Entity Recognition (NER) and Relation Extraction (RE). We will calculate an exact match macro average F1 score for each of the two test phases.

+

We evaluate submissions using the F1 score, a metric that reflects both the precision and recall of the Named Entity Recognition (NER) and Relation Extraction (RE) predictions. We will calculate a macro-averaged F1 score using exact-match criteria for each of the two test phases.

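The scoring described above can be sketched as follows. This is an illustrative sketch only, not the competition's official scorer: it assumes gold and predicted annotations are represented as (start, end, label) tuples, counts a prediction as correct only on an exact match of all three fields, computes F1 per label, and averages the per-label scores (macro average).

```python
# Illustrative sketch of exact-match macro-averaged F1 (not the official
# competition scorer). Annotations are assumed to be (start, end, label)
# tuples; the same idea applies to RE triples.
from collections import defaultdict


def macro_f1(gold, pred):
    """Return the macro-averaged exact-match F1 over labels."""
    gold_by_label = defaultdict(set)
    pred_by_label = defaultdict(set)
    for span in gold:
        gold_by_label[span[2]].add(span)
    for span in pred:
        pred_by_label[span[2]].add(span)

    f1_scores = []
    for label in set(gold_by_label) | set(pred_by_label):
        g, p = gold_by_label[label], pred_by_label[label]
        tp = len(g & p)  # exact match: identical (start, end, label)
        precision = tp / len(p) if p else 0.0
        recall = tp / len(g) if g else 0.0
        denom = precision + recall
        f1_scores.append(2 * precision * recall / denom if denom else 0.0)

    # Macro average: each label contributes equally, regardless of frequency.
    return sum(f1_scores) / len(f1_scores) if f1_scores else 0.0
```

For example, if the predictions match both "A" spans exactly but miss the single "B" span, the per-label F1 scores are 1.0 and 0.0, giving a macro average of 0.5.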
Competition Timeline Overview