From b9acf9157e2a9e1c718c4f01dcebe5b61f059550 Mon Sep 17 00:00:00 2001
From: Dani Palma
Date: Tue, 27 Aug 2024 10:15:36 -0300
Subject: [PATCH] Add StarTree Dekaf integration docs

---
 .../docs/reference/Connectors/dekaf/README.md |  3 +-
 .../Connectors/dekaf/dekaf-startree.md        | 38 +++++++++++++++++++
 2 files changed, 40 insertions(+), 1 deletion(-)
 create mode 100644 site/docs/reference/Connectors/dekaf/dekaf-startree.md

diff --git a/site/docs/reference/Connectors/dekaf/README.md b/site/docs/reference/Connectors/dekaf/README.md
index 15aed52f68..91a99e046c 100644
--- a/site/docs/reference/Connectors/dekaf/README.md
+++ b/site/docs/reference/Connectors/dekaf/README.md
@@ -6,4 +6,5 @@ functionality enables integrations with the Kafka ecosystem.
 
 ## Available Dekaf integrations
 
-- [Tinybird](/reference/Connectors/dekaf/dekaf-tinybird)
\ No newline at end of file
+- [Tinybird](/reference/Connectors/dekaf/dekaf-tinybird)
+- [StarTree](/reference/Connectors/dekaf/dekaf-startree)
\ No newline at end of file

diff --git a/site/docs/reference/Connectors/dekaf/dekaf-startree.md b/site/docs/reference/Connectors/dekaf/dekaf-startree.md
new file mode 100644
index 0000000000..191b08e3a8
--- /dev/null
+++ b/site/docs/reference/Connectors/dekaf/dekaf-startree.md
@@ -0,0 +1,38 @@
# StarTree

In this guide, you'll learn how to use Estuary Flow to push data streams to StarTree through StarTree's Kafka data source, backed by Dekaf, Estuary's Kafka API compatibility layer.

[StarTree](https://startree.ai/) is a real-time analytics platform built on Apache Pinot, designed for low-latency analytics on large-scale data.

## Prerequisites

- An Estuary Flow account with at least one collection
- A StarTree account

## Connecting Estuary Flow to StarTree

1. **Generate a new refresh token** to use for the StarTree connection. You can create this token from the Estuary Admin
   Dashboard.

   ![Export Dekaf Access Token](https://storage.googleapis.com/estuary-marketing-strapi-uploads/uploads//Group_22_95a85083d4/Group_22_95a85083d4.png)

2. In the StarTree UI, navigate to the **Data Sources** section and choose **Add New Data Source**.

3. Select **Kafka** as your data source type.

4. Enter the following connection details:

   - **Bootstrap Servers**: `dekaf.estuary.dev`
   - **Security Protocol**: `SASL_SSL`
   - **SASL Mechanism**: `PLAIN`
   - **SASL Username**: `{}` (any placeholder value, such as your Estuary username, works)
   - **SASL Password**: The refresh token you generated in step 1

5. **Configure Schema Registry**: to decode Avro-encoded messages, enable the schema registry settings:

   - **Schema Registry URL**: `https://dekaf.estuary.dev`
   - **Schema Registry Username**: `{}` (same as the SASL username)
   - **Schema Registry Password**: The same refresh token as above

6. Click **Create Connection** to proceed.
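
Before or after filling in the StarTree form, it can help to confirm that the Dekaf endpoint and your refresh token work with a plain Kafka client. The sketch below is not part of the official connector docs; it simply reuses the connection details from steps 4 and 5, assuming Python with the `confluent-kafka` package installed (plus `fastavro` for Avro decoding). The topic name and consumer group ID are placeholders: substitute one of your own Flow collections for the topic.

```python
# Minimal connectivity check (a sketch, not an official Estuary example).
# Assumes: pip install confluent-kafka fastavro
from confluent_kafka import Consumer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer
from confluent_kafka.serialization import SerializationContext, MessageField

REFRESH_TOKEN = "<your Estuary refresh token>"  # generated in step 1
TOPIC = "<your/flow-collection>"                # placeholder: one of your Flow collections

consumer = Consumer({
    "bootstrap.servers": "dekaf.estuary.dev",   # bootstrap server from step 4 (port defaults to 9092 if omitted)
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "{}",
    "sasl.password": REFRESH_TOKEN,
    "group.id": "dekaf-connectivity-check",     # any unused consumer group name
    "auto.offset.reset": "earliest",
})

# Schema registry settings mirror step 5; Dekaf serves Avro-encoded messages.
schema_registry = SchemaRegistryClient({
    "url": "https://dekaf.estuary.dev",
    "basic.auth.user.info": f"{{}}:{REFRESH_TOKEN}",
})
deserialize = AvroDeserializer(schema_registry)

consumer.subscribe([TOPIC])
msg = consumer.poll(timeout=30.0)
if msg is None or msg.error():
    print("No message received:", None if msg is None else msg.error())
else:
    value = deserialize(msg.value(), SerializationContext(msg.topic(), MessageField.VALUE))
    print("Sample document:", value)
consumer.close()
```

If this prints a document from your collection, the same bootstrap server, credentials, and schema registry settings should work as-is in the StarTree data source form.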