Includes a custom implementation of BERT (Bidirectional Encoder Representations from Transformers), fine-tuned on the NLI (Natural Language Inference) dataset to build two sentence encoders: a classification-objective model and a contrastive-objective model. The three BERT models (the base encoder and the two fine-tuned variants) are evaluated on the STS (Semantic Textual Similarity) benchmark, and their Pearson and Spearman correlation scores are compared.
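
A minimal sketch of how the two NLI fine-tuning objectives and the STS evaluation could look in PyTorch. The `encode` helper, `DIM`, `NUM_NLI_LABELS`, and the loss functions below are illustrative assumptions, not code taken from this repository.

```python
import torch
import torch.nn.functional as F
from scipy.stats import pearsonr, spearmanr

DIM = 768           # assumed hidden size of the custom BERT encoder
NUM_NLI_LABELS = 3  # entailment / neutral / contradiction

def classification_loss(u, v, labels, classifier):
    """Classification objective: softmax classifier over (u, v, |u - v|)."""
    features = torch.cat([u, v, torch.abs(u - v)], dim=-1)
    return F.cross_entropy(classifier(features), labels)

def contrastive_loss(u, v, temperature=0.05):
    """In-batch contrastive objective: each premise embedding should be
    most similar to its own hypothesis embedding (positives on the diagonal)."""
    sims = F.normalize(u, dim=-1) @ F.normalize(v, dim=-1).t() / temperature
    return F.cross_entropy(sims, torch.arange(u.size(0)))

def evaluate_sts(encode, sentence_pairs, gold_scores):
    """Score each pair by cosine similarity of its embeddings and report
    Pearson and Spearman correlation against the gold STS scores."""
    with torch.no_grad():
        u = encode([a for a, _ in sentence_pairs])
        v = encode([b for _, b in sentence_pairs])
        preds = F.cosine_similarity(u, v, dim=-1).cpu().numpy()
    return pearsonr(preds, gold_scores)[0], spearmanr(preds, gold_scores)[0]

if __name__ == "__main__":
    # Random stand-in encoder, shapes only; the real encoder would be the
    # custom BERT model producing one embedding per sentence.
    encode = lambda sents: torch.randn(len(sents), DIM)
    classifier = torch.nn.Linear(3 * DIM, NUM_NLI_LABELS)
    u, v = encode(["p"] * 4), encode(["h"] * 4)
    labels = torch.randint(0, NUM_NLI_LABELS, (4,))
    print(classification_loss(u, v, labels, classifier).item())
    print(contrastive_loss(u, v).item())
    print(evaluate_sts(encode, [("a", "b")] * 4, [0.5, 2.0, 3.5, 5.0]))
```

The (u, v, |u - v|) feature concatenation is the layout commonly used when fine-tuning sentence encoders on NLI with a softmax head; whether this repository uses the same feature layout is an assumption.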