diff --git a/content/docs/02-guides/04-r1.mdx b/content/docs/02-guides/04-r1.mdx
index bda108b09407..b837cd4ed085 100644
--- a/content/docs/02-guides/04-r1.mdx
+++ b/content/docs/02-guides/04-r1.mdx
@@ -109,12 +109,13 @@ const { reasoning, text } = await generateText({
You can use DeepSeek R1 with the AI SDK through various providers. Here's a comparison of the providers that support DeepSeek R1:
-| Provider | Model ID | Reasoning Tokens |
-| -------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------- | ------------------- |
-| [DeepSeek](/providers/ai-sdk-providers/deepseek) | [`deepseek-reasoner`](https://api-docs.deepseek.com/guides/reasoning_model) | |
-| [Fireworks](/providers/ai-sdk-providers/fireworks) | [`accounts/fireworks/models/deepseek-r1`](https://fireworks.ai/models/fireworks/deepseek-r1) | Requires Middleware |
-| [Groq](/providers/ai-sdk-providers/groq) | [`deepseek-r1-distill-llama-70b`](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-70B) | Requires Middleware |
-| [Azure](/providers/ai-sdk-providers/azure) | [`DeepSeek-R1`](https://ai.azure.com/explore/models/DeepSeek-R1/version/1/registry/azureml-deepseek#code-samples) | Requires Middleware |
+| Provider | Model ID | Reasoning Tokens |
+| ----------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------- | ------------------- |
+| [DeepSeek](/providers/ai-sdk-providers/deepseek) | [`deepseek-reasoner`](https://api-docs.deepseek.com/guides/reasoning_model) | |
+| [Fireworks](/providers/ai-sdk-providers/fireworks) | [`accounts/fireworks/models/deepseek-r1`](https://fireworks.ai/models/fireworks/deepseek-r1) | Requires Middleware |
+| [Groq](/providers/ai-sdk-providers/groq) | [`deepseek-r1-distill-llama-70b`](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-70B) | Requires Middleware |
+| [Azure](/providers/ai-sdk-providers/azure) | [`DeepSeek-R1`](https://ai.azure.com/explore/models/DeepSeek-R1/version/1/registry/azureml-deepseek#code-samples) | Requires Middleware |
+| [Together AI](/providers/ai-sdk-providers/togetherai) | [`deepseek-ai/DeepSeek-R1`](https://www.together.ai/models/deepseek-r1) | Requires Middleware |
### Building Interactive Interfaces
diff --git a/content/providers/01-ai-sdk-providers/24-togetherai.mdx b/content/providers/01-ai-sdk-providers/24-togetherai.mdx
index 419967c7f851..f178ede235bc 100644
--- a/content/providers/01-ai-sdk-providers/24-togetherai.mdx
+++ b/content/providers/01-ai-sdk-providers/24-togetherai.mdx
@@ -74,6 +74,23 @@ You can create [Together.ai models](https://docs.together.ai/docs/serverless-mod
const model = togetherai('google/gemma-2-9b-it');
```
+### Reasoning Models
+
+Together.ai exposes the thinking of `deepseek-ai/DeepSeek-R1` in the generated text using the `<think>` tag.
+You can use the `extractReasoningMiddleware` to extract this reasoning and expose it as a `reasoning` property on the result:
+
+```ts
+import { togetherai } from '@ai-sdk/togetherai';
+import { wrapLanguageModel, extractReasoningMiddleware } from 'ai';
+
+const enhancedModel = wrapLanguageModel({
+ model: togetherai('deepseek-ai/DeepSeek-R1'),
+ middleware: extractReasoningMiddleware({ tagName: 'think' }),
+});
+```
+
+You can then use that enhanced model in functions like `generateText` and `streamText`.
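+
+For example, here is a minimal sketch (the prompt is illustrative) that passes the wrapped model from above to `generateText` and reads the extracted reasoning from the result:
+
+```ts
+import { generateText } from 'ai';
+
+// `enhancedModel` is the wrapped DeepSeek R1 model from the previous snippet.
+const { reasoning, text } = await generateText({
+  model: enhancedModel,
+  prompt: 'How many people will live in the world in 2040?',
+});
+```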
+
### Example
You can use Together.ai language models to generate text with the `generateText` function: