Commit dbfc8cc (permalink: open-metadata/OpenMetadata@1dc79bf on refs/heads/main)
Committed by open-metadata on Dec 20, 2023 · 1 parent: 7e47a09
Showing 62 changed files with 179 additions and 148 deletions.
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/athena/yaml.md
@@ -784,9 +784,10 @@ You can learn more about how to ingest lineage [here](/connectors/ingestion/work
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/athena/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/azuresql/yaml.md
@@ -499,9 +499,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/azuresql/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/db2/yaml.md
@@ -502,9 +502,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/db2/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/deltalake/yaml.md
@@ -282,9 +282,10 @@ you will be able to extract metadata from different sources.
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/deltalake/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/domo-database/yaml.md
@@ -272,9 +272,10 @@ you will be able to extract metadata from different sources.
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/domo-database/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/dynamodb/yaml.md
@@ -287,9 +287,10 @@ you will be able to extract metadata from different sources.
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/dynamodb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/hive/yaml.md
@@ -521,9 +521,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/hive/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/impala/yaml.md
@@ -480,9 +480,10 @@ link="/connectors/ingestion/workflows/dbt" /%}
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/impala/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
2 changes: 1 addition & 1 deletion content/v1.1.x/connectors/database/index.md
@@ -27,8 +27,8 @@ This is the supported list of connectors for Database Services:
 - [Postgres](/connectors/database/postgres)
 - [Presto](/connectors/database/presto)
 - [Redshift](/connectors/database/redshift)
-- [Sap Hana](/connectors/database/saphana)
 - [Salesforce](/connectors/database/salesforce)
+- [Sap Hana](/connectors/database/sap-hana)
 - [SingleStore](/connectors/database/singlestore)
 - [Snowflake](/connectors/database/snowflake)
 - [Trino](/connectors/database/trino)
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/mariadb/yaml.md
@@ -482,9 +482,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/mariadb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/mongodb/yaml.md
@@ -233,9 +233,10 @@ you will be able to extract metadata from different sources.
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/mongodb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/mysql/yaml.md
@@ -605,9 +605,10 @@ source:
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/mysql/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/oracle/yaml.md
@@ -526,9 +526,10 @@ You can learn more about how to ingest lineage [here](/connectors/ingestion/work
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/oracle/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/pinotdb/yaml.md
@@ -524,9 +524,10 @@ source:
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/pinotdb/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/presto/yaml.md
@@ -484,9 +484,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/presto/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/salesforce/yaml.md
@@ -257,9 +257,10 @@ you will be able to extract metadata from different sources.
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/salesforce/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/sap-hana/yaml.md
@@ -513,9 +513,10 @@ Note how instead of running `ingest`, we are using the `profile` command to sele
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/sap-hana/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/singlestore/yaml.md
@@ -476,9 +476,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/singlestore/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/trino/yaml.md
@@ -559,9 +559,10 @@ source:
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/trino/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/database/vertica/yaml.md
@@ -521,9 +521,10 @@ Note now instead of running `ingest`, we are using the `profile` command to sele
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/vertica/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
1 change: 1 addition & 0 deletions content/v1.1.x/connectors/index.md
@@ -52,6 +52,7 @@ the following docs to run the Ingestion Framework in any orchestrator externally
 - [PinotDB](/connectors/database/pinotdb)
 - [Redshift](/connectors/database/redshift)
 - [Salesforce](/connectors/database/salesforce)
+- [Sap Hana](/connectors/database/sap-hana)
 - [SingleStore](/connectors/database/singlestore)
 - [Snowflake](/connectors/database/snowflake)
 - [SQLite](/connectors/database/sqlite)
4 changes: 2 additions & 2 deletions content/v1.1.x/connectors/ingestion/deployment/index.md
@@ -62,8 +62,8 @@ While the endpoints are directly defined in the `IngestionPipelineResource`, the
 that decouples how OpenMetadata communicates with the Orchestrator, as different external systems will need different
 calls and data to be sent.
 
-- You can find the `PipelineServiceClient` abstraction [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/util/PipelineServiceClient.java),
-- And the `AirflowRESTClient` implementation [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/airflow/AirflowRESTClient.java).
+- You can find the `PipelineServiceClient` abstraction [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/java/org/openmetadata/sdk/PipelineServiceClient.java),
+- And the `AirflowRESTClient` implementation [here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-service/src/main/java/org/openmetadata/service/clients/pipeline/airflow/AirflowRESTClient.java).
 
 The clients that implement the abstractions from the `PipelineServiceClient` are merely a translation layer between the
 information received in the shape of an `IngestionPipeline` Entity, and the specific requirements of each Orchestrator.
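The "translation layer" described in that context can be sketched as a minimal Python analogue. Note the real classes are Java, and the names below (`deploy`, `AirflowClient`, the payload fields) are hypothetical, chosen only to illustrate how an `IngestionPipeline` entity might map onto orchestrator-specific data:

```python
from abc import ABC, abstractmethod


class PipelineServiceClient(ABC):
    """Decouples OpenMetadata from any specific orchestrator (sketch)."""

    @abstractmethod
    def deploy(self, ingestion_pipeline: dict) -> dict:
        """Translate an IngestionPipeline entity into an orchestrator call."""


class AirflowClient(PipelineServiceClient):
    """Hypothetical Airflow-flavored client for illustration only."""

    def deploy(self, ingestion_pipeline: dict) -> dict:
        # A real client would POST this payload to the orchestrator's REST API;
        # here we only build it, to show the translation step itself.
        return {
            "dag_id": ingestion_pipeline["name"],
            "schedule": ingestion_pipeline.get("airflowConfig", {}).get(
                "scheduleInterval", "@daily"
            ),
        }


payload = AirflowClient().deploy({"name": "redshift_metadata"})
print(payload["dag_id"])  # redshift_metadata
```

Swapping `AirflowClient` for another subclass is the whole point of the abstraction: the server only ever talks to the `PipelineServiceClient` interface.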
12 changes: 6 additions & 6 deletions content/v1.1.x/connectors/ingestion/lineage/index.md
@@ -119,12 +119,12 @@ as well). You might also need to validate if the query logs are available in the
 
 You can check the queries being used here:
 
-- [BigQuery](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L428)
-- [Snowflake](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L197)
-- [MSSQL](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L350)
-- [Redshift](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L18)
-- [Clickhouse](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L376)
-- [Postgres](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/utils/sql_queries.py#L467)
+- [BigQuery](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/bigquery/queries.py)
+- [Snowflake](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/snowflake/queries.py)
+- [MSSQL](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/mssql/queries.py)
+- [Redshift](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/redshift/queries.py)
+- [Clickhouse](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/clickhouse/queries.py)
+- [Postgres](https://github.com/open-metadata/OpenMetadata/blob/main/ingestion/src/metadata/ingestion/source/database/postgres/queries.py)
 
 By default, we apply a result limit of 1000 records. You might also need to increase that for databases with big volumes
 of queries.
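The default result limit mentioned in that hunk (1000 records) is the kind of value that gets substituted into the query-log SQL when it is rendered. A hedged sketch of that mechanism, using an illustrative template rather than any of the actual queries linked above:

```python
# Sketch only: the template string and parameter name are illustrative,
# not the exact OpenMetadata source code.
DEFAULT_RESULT_LIMIT = 1000


def render_query_log_query(template: str, result_limit: int = DEFAULT_RESULT_LIMIT) -> str:
    """Render a query-log SQL template with the configured row cap."""
    return template.format(result_limit=result_limit)


TEMPLATE = (
    "SELECT query_text FROM query_history "
    "WHERE start_time >= :start LIMIT {result_limit}"
)

print(render_query_log_query(TEMPLATE))                     # ends with LIMIT 1000
print(render_query_log_query(TEMPLATE, result_limit=5000))  # ends with LIMIT 5000
```

For warehouses with large query volumes, raising the limit (as the doc text advises) is a one-parameter change under this design.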
2 changes: 1 addition & 1 deletion content/v1.1.x/connectors/ml-model/mlflow/yaml.md
@@ -41,7 +41,7 @@ the steps to create a YAML configuration able to connect to the source,
 process the Entities if needed, and reach the OpenMetadata server.
 
 The workflow is modeled around the following
-[JSON Schema](https://github.com/open-metadata/OpenMetadatablob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
+[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/mlmodelServiceMetadataPipeline.json)
 
 ### 1. Define the YAML Config
 
4 changes: 2 additions & 2 deletions content/v1.1.x/connectors/ml-model/sagemaker/yaml.md
@@ -56,15 +56,15 @@ pip3 install "openmetadata-ingestion[sagemaker]"
 ## Metadata Ingestion
 
 All connectors are defined as JSON Schemas.
-[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/mlmodel/sagemakerConnection.json)
+[Here](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/entity/services/connections/mlmodel/sageMakerConnection.json)
 you can find the structure to create a connection to Sagemaker.
 
 In order to create and run a Metadata Ingestion workflow, we will follow
 the steps to create a YAML configuration able to connect to the source,
 process the Entities if needed, and reach the OpenMetadata server.
 
 The workflow is modeled around the following
-[JSON Schema](https://github.com/open-metadata/OpenMetadatablob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/workflow.json)
+[JSON Schema](https://github.com/open-metadata/OpenMetadata/blob/main/openmetadata-spec/src/main/resources/json/schema/metadataIngestion/mlmodelServiceMetadataPipeline.json)
 
 ### 1. Define the YAML Config
 
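Whatever the exact schema file, the ingestion workflows these connector pages build share one overall shape: a source, a sink, and a workflowConfig pointing at the OpenMetadata server. A hedged sketch of that structure as a plain Python dict — field values are placeholders, not a validated SageMaker configuration, and the authoritative field list lives in the JSON Schemas:

```python
import json

# Illustrative only: consult the workflow JSON Schema for the real field list.
workflow = {
    "source": {
        "type": "sagemaker",
        "serviceName": "local_sagemaker",
        "serviceConnection": {"config": {"type": "SageMaker"}},
        "sourceConfig": {"config": {}},
    },
    "sink": {"type": "metadata-rest", "config": {}},
    "workflowConfig": {
        "openMetadataServerConfig": {"hostPort": "http://localhost:8585/api"}
    },
}

# The kind of top-level check schema validation would enforce:
required = {"source", "sink", "workflowConfig"}
missing = required - workflow.keys()
assert not missing, f"missing top-level keys: {missing}"
print(json.dumps(list(sorted(workflow))))  # ["sink", "source", "workflowConfig"]
```

Keeping the YAML aligned with this three-part shape is what lets the same Workflow runner serve every connector.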
9 changes: 5 additions & 4 deletions content/v1.1.x/connectors/storage/s3/yaml.md
@@ -338,9 +338,10 @@ you will be able to extract metadata from different sources.
 {% tilesContainer %}
 
 {% tile
-title="Ingest with Airflow"
-description="Configure the ingestion using Airflow SDK"
-link="/connectors/database/athena/airflow"
-/ %}
+icon="mediation"
+title="Configure Ingestion Externally"
+description="Deploy, configure, and manage the ingestion workflows externally."
+link="/deployment/ingestion"
+/ %}
 
 {% /tilesContainer %}
2 changes: 1 addition & 1 deletion content/v1.1.x/deployment/bare-metal/index.md
@@ -237,7 +237,7 @@ installation.
 
 ## Next Steps
 
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
 2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
 OpenMetadata.
 3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.
2 changes: 1 addition & 1 deletion content/v1.1.x/deployment/docker/index.md
@@ -367,7 +367,7 @@ installation.
 
 ## Next Steps
 
-1. Visit the [Features](/releases/features) overview page and explore the OpenMetadata UI.
+1. Refer the [How-to Guides](/how-to-guides) for an overview of all the features in OpenMetadata.
 2. Visit the [Connectors](/connectors) documentation to see what services you can integrate with
 OpenMetadata.
 3. Visit the [API](/swagger.html) documentation and explore the rich set of OpenMetadata APIs.