From 59e9524b7ffdaa09061ae6d178c355789817bdc1 Mon Sep 17 00:00:00 2001
From: Eric Alcaide
Date: Sat, 11 Nov 2023 11:36:51 +0100
Subject: [PATCH] Update README.md

---
 docs/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/README.md b/docs/README.md
index 1dc8861..e1c0b95 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -2,7 +2,7 @@
 
 # RWKV Language Model
 
-RWKV (pronounced as RwaKuv) is an RNN with GPT-level LLM performance, which can also be directly trained like a GPT transformer (parallelizable).
+RWKV (pronounced as RWaKuV) is an RNN with GPT-level LLM performance, which can also be directly trained like a GPT transformer (parallelizable).
 So it's combining the best of RNN and transformer - great performance, fast inference, fast training, saves VRAM, "infinite" ctxlen, and free sentence embedding.
 Moreover it's 100% attention-free.