[FLINK-34741][cdc][docs] Translate get-started page for Flink CDC doc to Chinese

This closes #3175.

loserwang1024 authored Mar 20, 2024
1 parent bacb6a3 commit 43a5887
Showing 5 changed files with 214 additions and 234 deletions.
67 changes: 28 additions & 39 deletions docs/content.zh/docs/get-started/introduction.md
@@ -24,28 +24,23 @@ specific language governing permissions and limitations
under the License.
-->

# Welcome to Flink CDC 🎉
# Welcome to Flink CDC 🎉

Flink CDC is a streaming data integration tool that aims to provide users with
a more robust API. It allows users to elegantly describe their ETL pipeline
logic via YAML and helps users automatically generate customized Flink
operators and submit jobs. Flink CDC prioritizes optimizing the task submission
process and offers enhanced functionalities such as schema evolution, data
transformation, full database synchronization, and exactly-once semantics.
Flink CDC is a stream-based data integration tool that aims to provide users
with a more comprehensive set of programming APIs. It allows users to elegantly
define their ETL (Extract, Transform, Load) processes in a YAML configuration
file, and helps them automatically generate customized Flink operators and
submit Flink jobs. Flink CDC optimizes the task submission process and adds
advanced features such as automatic schema change synchronization (schema
evolution), data transformation, full database synchronization, and
exactly-once semantics.

Deeply integrated with and powered by Apache Flink, Flink CDC provides:
Deeply integrated with and powered by Apache Flink, Flink CDC provides the
following core features:
* ✅ An end-to-end data integration framework
* ✅ An API that makes it easy for data integration users to build jobs
* ✅ Multi-table support in Source and Sink
* ✅ Full database synchronization
* ✅ Schema evolution capability (automatic synchronization of schema changes)

* ✅ End-to-end data integration framework
* ✅ API for data integration users to build jobs easily
* ✅ Multi-table support in Source / Sink
* ✅ Synchronization of entire databases
* ✅ Schema evolution capability
## How to Use Flink CDC

## How to Use Flink CDC
Flink CDC provides a `YAML`-based user API that is better suited to data
integration scenarios. Here is an example `YAML` file that defines a data
pipeline (Pipeline) which captures real-time changes from MySQL and
synchronizes them to Apache Doris:

Flink CDC provides a YAML-formatted user API that is more suitable for data
integration scenarios. Here's an example YAML file defining a data pipeline
that ingests real-time changes from MySQL and synchronizes them to Apache Doris:

```yaml
source:
@@ -71,27 +66,23 @@ pipeline:
  parallelism: 2
```
By submitting the YAML file with `flink-cdc.sh`, a Flink job will be compiled
and deployed to a designated Flink cluster. Please refer to [Core Concept]({{<
ref "docs/core-concept/data-pipeline" >}}) to get full documentation of all
supported functionalities of a pipeline.
By submitting the YAML file with `flink-cdc.sh`, a Flink job will be compiled
and deployed to the designated Flink cluster. Please refer to
[Core Concept]({{< ref "docs/core-concept/data-pipeline" >}}) for complete
documentation of all the functionalities a pipeline supports.
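The middle of the YAML example above is collapsed in the diff view. As a rough
sketch of what a complete pipeline definition looks like, modeled on the
MySQL-to-Doris quickstart referenced below (the hostnames, credentials, and
table patterns here are illustrative placeholders, not values from this commit):

```yaml
# Hypothetical pipeline definition; connection details below are
# placeholder assumptions for illustration only.
source:
  type: mysql              # capture row-level changes from a MySQL binlog
  hostname: localhost
  port: 3306
  username: root
  password: "123456"
  tables: app_db.\.*       # regex matching every table in the app_db database

sink:
  type: doris              # write the change stream to Apache Doris
  fenodes: 127.0.0.1:8030
  username: root
  password: ""

pipeline:
  name: Sync MySQL Database to Doris
  parallelism: 2
```

A file like this would then be submitted with `flink-cdc.sh` against a running
Flink cluster, as described above.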

## Write Your First Flink CDC Pipeline
## Write Your First Flink CDC Pipeline

Explore the Flink CDC documentation to get hands-on with your first real-time
data integration pipeline:
Explore the Flink CDC documentation to start building your first real-time
data integration pipeline (Pipeline).

### Quickstart
### Quickstart

Check out the quickstart guide to learn how to establish a Flink CDC pipeline:
Check out the quickstart guides to learn how to set up a Flink CDC pipeline:

- [MySQL to Apache Doris]({{< ref "docs/get-started/quickstart/mysql-to-doris" >}})
- [MySQL to StarRocks]({{< ref "docs/get-started/quickstart/mysql-to-starrocks" >}})

### Understand Core Concepts
### Understand Core Concepts

Get familiar with core concepts we introduced in Flink CDC and try to build
more complex pipelines:
Get familiar with the core concepts introduced in Flink CDC and try to build
more complex data pipelines:

- [Data Pipeline]({{< ref "docs/core-concept/data-pipeline" >}})
- [Data Source]({{< ref "docs/core-concept/data-source" >}})
@@ -100,21 +91,19 @@ more complex pipelines:
- [Transform]({{< ref "docs/core-concept/transform" >}})
- [Route]({{< ref "docs/core-concept/route" >}})

### Submit Pipeline to Flink Cluster
### Submit a Pipeline to a Flink Cluster

Learn how to submit the pipeline to a Flink cluster running in different
deployment modes:
Learn how to submit a pipeline to a Flink cluster running in different
deployment modes:

- [standalone]({{< ref "docs/deployment/standalone" >}})
- [Kubernetes]({{< ref "docs/deployment/kubernetes" >}})
- [YARN]({{< ref "docs/deployment/yarn" >}})

## Development and Contribution
## Development and Contribution

If you want to connect Flink CDC to your customized external system, or
contribute to the framework itself, these sections may be helpful:

If you want to connect Flink CDC to your customized external system, or
contribute to the framework itself, these sections could be helpful:
- Understand the [Flink CDC APIs]({{< ref "docs/developer-guide/understand-flink-cdc-api" >}}) to develop your own Flink CDC connector
- Learn how to [contribute to Flink CDC]({{< ref "docs/developer-guide/contribute-to-flink-cdc" >}})
- Check out the [licenses]({{< ref "docs/developer-guide/licenses" >}}) used by Flink CDC

- Understand [Flink CDC APIs]({{< ref "docs/developer-guide/understand-flink-cdc-api" >}})
to develop your own Flink CDC connector
- Learn how to [contribute to Flink CDC]({{< ref "docs/developer-guide/contribute-to-flink-cdc" >}})
- Check out [licenses]({{< ref "docs/developer-guide/licenses" >}}) used by Flink CDC
