Commit

Merge branch 'main' into DAT-17989
PavloTytarchuk authored Aug 19, 2024
2 parents 3b12076 + ad57a34 commit 0e2b3f8
Showing 90 changed files with 1,159 additions and 326 deletions.
23 changes: 20 additions & 3 deletions .github/workflows/lth.yml
@@ -1,11 +1,16 @@
name: Liquibase Test Harness

on:
  workflow_dispatch:
  pull_request:
  push:
    branches:
      - main

# Static value for the workflow group, to ensure only one run of the workflow can be in progress at a time.
concurrency:
  group: liquibase-test-harness
  cancel-in-progress: false

jobs:
  liquibase-test-harness:
    name: Liquibase Test Harness
@@ -57,7 +62,7 @@ jobs:
        run: mvn -B -ntp -Dmaven.test.skip package

      - name: Run ${{ matrix.liquibase-support-level }} Liquibase Test Harness # Run the Liquibase test harness at each test level
        continue-on-error: true # Continue to run the action even if the previous steps fail
        if: always() # Run the action even if the previous steps fail
        run: mvn -B -ntp -DdbPassword=${{env.TF_VAR_DBX_TOKEN}} -DdbUrl='${{env.DATABRICKS_URL}}' -Dtest=liquibase.ext.databricks.${{ matrix.liquibase-support-level }}ExtensionHarnessTestSuite test # Run the Liquibase test harness at each test level

      - name: Test Reporter # Generate a test report using the Test Reporter action
@@ -72,4 +77,16 @@ jobs:
      - name: Stop test database
        if: always() # Always destroy, even if the previous steps fail
        working-directory: src/test/terraform
        run: terraform destroy -auto-approve
        run: |
          set -e
          # Capture the current Terraform state as JSON.
          TERRAFORM_OUTPUT=$(terraform show -json)
          if [ -z "$TERRAFORM_OUTPUT" ]; then
            echo "Terraform output is empty. Skipping removal."
          else
            # Only destroy when the state still contains the test-harness schema.
            SCHEMA_EXISTS=$(echo "$TERRAFORM_OUTPUT" | jq -r '.values.root_module.resources[] | select(.address == "databricks_schema.test_harness") | .values.name')
            if [ "$SCHEMA_EXISTS" == "liquibase_harness_test_ds" ]; then
              terraform destroy -auto-approve
            else
              echo "Schema does not exist. Skipping removal."
            fi
          fi
28 changes: 13 additions & 15 deletions README.md
@@ -48,7 +48,7 @@ If hive_metastore is used, this is not tested and may not provide all the below
21. [x] Change Data Test: apply insert
22. [x] Change Data Test: apply loadData
23. [x] Change Data Test: apply loadDataUpdate
24. [ ] Add/Drop Check Constraints - TO DO: Need to create snapshot generator but the change type works
24. [x] Add/Drop Check Constraints - supported but not returned in snapshot

### Advanced
1. [x] addColumn snapshot
@@ -58,16 +58,17 @@ If hive_metastore is used, this is not tested and may not provide all the below
5. [x] createTable snapshot
6. [x] createView snapshot
7. [x] generateChangelog -
2. [x] addUniqueConstraint - not supported
3. [x] createIndex - Not supported; use the changeClusterColumns change type for Databricks to map to CLUSTER BY ALTER TABLE statements for Delta Tables

8. [x] addUniqueConstraint - not supported
9. [x] createIndex - Not supported; use the changeClusterColumns change type for Databricks to map to CLUSTER BY ALTER TABLE statements for Delta Tables
10. [x] alterTableProperties
11. [x] alterCluster

### Databricks Specific:
1. [x] OPTIMIZE - optimizeTable - optimize with zorderCols options - <b> SUPPORTED </b> in Contributed Harness
2. [x] CLUSTER BY (DDL) - createClusteredTable - createTable with clusterColumns as additional option for liquid - <b> SUPPORTED </b> in Contributed Harness
3. [x] ANALYZE TABLE - analyzeTable - change type with compute stats column options - <b> SUPPORTED </b> in Contributed Harness
4. [x] VACUUM - vacuumTable - change type with retentionHours parameter (default is 168) - <b> SUPPORTED </b> in Contributed Harness
5. [ ] ALTER CLUSTER KEY - changeClusterColumns - change type that will be used until index change types are mapped with CLUSTER BY columns for snapshot purposes - TO DO
5. [x] ALTER CLUSTER KEY - alterCluster - change type that will be used until index change types are mapped with CLUSTER BY columns for snapshot purposes


## Remaining Required Change Types to Finish in Base/Contributed
@@ -88,16 +89,13 @@ The remaining other change types are not relevant to Databricks and have been ma
2. MERGE
3. RESTORE VERSION AS OF
4. ANALYZE TABLE - Code Complete - Adding Tests - Cody Davis
5. SET TBL PROPERTIES - (Defaults are in createTable change type with min required table props to support Liquibase)
6. CLONE
7. BLOOM FILTERS - Maybe do not support, CLUSTER BY should be the primary indexing mechanism long term
8. OPTIMIZE / ZORDER - Code Complete - Adding Tests - Cody Davis
9. VACUUM - Code Complete - Adding Tests - Cody Davis
10. SYNC IDENTITY
11. VOLUMES
12. GRANT / REVOKE statements
13. CLUSTER BY - Similar to Indexes, important to support as a create table / alter table set of change types (params in createTable change), addClusterKey new change type to ALTER TABle

5. CLONE
6. BLOOM FILTERS - Maybe do not support, CLUSTER BY should be the primary indexing mechanism long term
7. OPTIMIZE / ZORDER - Code Complete - Adding Tests - Cody Davis
8. VACUUM - Code Complete - Adding Tests - Cody Davis
9. SYNC IDENTITY
10. VOLUMES
11. GRANT / REVOKE statements


## How to use the Liquibase-Databricks Extension
18 changes: 12 additions & 6 deletions pom.xml
@@ -47,7 +47,7 @@
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<liquibase.version>4.28.0</liquibase.version>
<liquibase.version>4.29.1</liquibase.version>
<sonar.organization>liquibase</sonar.organization>
<sonar.projectKey>${sonar.organization}_${project.artifactId}</sonar.projectKey>
<sonar.projectName>${project.name}</sonar.projectName>
@@ -96,7 +96,7 @@
<dependency>
<groupId>com.databricks</groupId>
<artifactId>databricks-jdbc</artifactId>
<version>2.6.38</version>
<version>2.6.40</version>
<scope>test</scope>
</dependency>

@@ -122,9 +122,15 @@
<dependency>
<groupId>com.databricks</groupId>
<artifactId>databricks-jdbc</artifactId>
<version>2.6.38</version>
<version>2.6.40</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.34</version>
<scope>provided</scope>
</dependency>
</dependencies>

<build>
@@ -170,7 +176,7 @@
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.7.0</version>
<version>3.8.0</version>
<configuration>
<attach>true</attach>
<author>false</author>
@@ -210,7 +216,7 @@
<plugin>
<groupId>org.codehaus.gmavenplus</groupId>
<artifactId>gmavenplus-plugin</artifactId>
<version>2.1.0</version>
<version>3.0.2</version>
<executions>
<execution>
<goals>
@@ -240,7 +246,7 @@
<plugin>
<groupId>org.sonarsource.scanner.maven</groupId>
<artifactId>sonar-maven-plugin</artifactId>
<version>3.11.0.3922</version>
<version>4.0.0.4121</version>
</plugin>

<plugin>
AddCheckConstraintChangeDatabricks.java
@@ -4,13 +4,14 @@
import liquibase.database.Database;
import liquibase.ext.databricks.change.dropCheckConstraint.DropCheckConstraintChangeDatabricks;
import liquibase.ext.databricks.database.DatabricksDatabase;
import liquibase.servicelocator.PrioritizedService;
import liquibase.statement.SqlStatement;

import java.text.MessageFormat;
import liquibase.ext.databricks.database.DatabricksDatabase;

@DatabaseChange(name = "addCheckConstraint",
description = "Adds check constraint to Delta Table",
priority = DatabricksDatabase.PRIORITY_DEFAULT + 200,
priority = PrioritizedService.PRIORITY_DATABASE,
appliesTo = {"column"}
)
public class AddCheckConstraintChangeDatabricks extends AbstractChange {
@@ -23,35 +24,38 @@ public class AddCheckConstraintChangeDatabricks extends AbstractChange {

private String constraintBody;


@Override
public boolean supports(Database database) {
return database instanceof DatabricksDatabase;
}

public String getCatalogName() {
return catalogName;
}

public void setCatalogName (String catalogName) {
public void setCatalogName(String catalogName) {
this.catalogName = catalogName;
}

public String getTableName() {
return tableName;
}

public void setTableName (String tableName) {
public void setTableName(String tableName) {
this.tableName = tableName;
}

public String getSchemaName() {
return schemaName;
}

public void setSchemaName (String schemaName) {
public void setSchemaName(String schemaName) {
this.schemaName = schemaName;
}


// Name of Delta Table Constraint
@DatabaseChangeProperty(
description = "Name of the check constraint"
)
@DatabaseChangeProperty(description = "Name of the check constraint")
public String getConstraintName() {
return this.constraintName;
}
@@ -60,9 +64,7 @@ public void setConstraintName(String name) {
this.constraintName = name;
}


// The is the SQL expression involving the contraint

// This is the SQL expression involving the constraint
@DatabaseChangeProperty(
serializationType = SerializationType.DIRECT_VALUE
)
@@ -76,7 +78,8 @@ public void setConstraintBody(String body) {

@Override
public String getConfirmationMessage() {
return MessageFormat.format("{0}.{1}.{2} successfully added check constraint {3}.", getCatalogName(), getSchemaName(), getTableName(), getConstraintName());
return MessageFormat.format("{0}.{1}.{2} successfully added check constraint {3}.", getCatalogName(), getSchemaName(), getTableName(),
getConstraintName());
}

protected Change[] createInverses() {
@@ -100,6 +103,6 @@ public SqlStatement[] generateStatements(Database database) {
statement.setConstraintName(getConstraintName());
statement.setConstraintBody(getConstraintBody());

return new SqlStatement[] {statement};
return new SqlStatement[]{statement};
}
}
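For reference, a changelog entry exercising this change type might look as follows. This is a minimal sketch, not taken from the repository: the `databricks` namespace URI, the changeSet id/author, and `test_table` are all assumptions. Because `constraintBody` is declared with `SerializationType.DIRECT_VALUE`, the check expression is written as the element's text content rather than as an attribute.

```xml
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    xmlns:databricks="http://www.liquibase.org/xml/ns/databricks">
    <!-- The databricks namespace URI above is an assumed binding for the extension. -->
    <changeSet id="check-constraint-example" author="example">
        <!-- constraintBody uses DIRECT_VALUE serialization, so the SQL
             expression goes in the element body. -->
        <databricks:addCheckConstraint
            tableName="test_table"
            constraintName="id_is_positive">id &gt; 0</databricks:addCheckConstraint>
    </changeSet>
</databaseChangeLog>
```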
AddCheckConstraintStatementDatabricks.java
@@ -8,8 +8,6 @@ public class AddCheckConstraintStatementDatabricks extends AbstractSqlStatement
private String tableName;
private String constraintName;
private String constraintBody;
private boolean validate = true;
private boolean disabled;


public String getCatalogName() {
AddForeignKeyConstraintChangeDatabricks.java
@@ -1,11 +1,11 @@
package liquibase.ext.databricks.change.addForeignKeyConstraint;

import liquibase.ext.databricks.database.DatabricksDatabase;

import liquibase.change.*;
import liquibase.database.Database;
import liquibase.database.DatabaseFactory;
import liquibase.exception.UnexpectedLiquibaseException;
import liquibase.ext.databricks.database.DatabricksDatabase;
import liquibase.servicelocator.PrioritizedService;
import liquibase.snapshot.SnapshotGeneratorFactory;
import liquibase.statement.SqlStatement;
import liquibase.statement.core.AddForeignKeyConstraintStatement;
@@ -23,8 +23,9 @@
*/
@DatabaseChange(name = "addForeignKeyConstraint",
description = "Adds a foreign key constraint to an existing column",
priority = DatabricksDatabase.PRIORITY_DATABASE,
priority = PrioritizedService.PRIORITY_DATABASE,
appliesTo = "column")
// TODO: this class needs refactoring, as it copies the parent class instead of properly inheriting from it.
public class AddForeignKeyConstraintChangeDatabricks extends AddForeignKeyConstraintChange {

private String baseTableCatalogName;
@@ -46,6 +47,10 @@ public class AddForeignKeyConstraintChangeDatabricks extends AddForeignKeyConstr
private String onUpdate;
private String onDelete;

@Override
public boolean supports(Database database) {
return database instanceof DatabricksDatabase;
}

@Override
protected String[] createSupportedDatabasesMetaData(
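Because this class extends Liquibase's standard AddForeignKeyConstraintChange, the stock changelog syntax should apply unchanged. A minimal sketch — all table and column names here are illustrative placeholders:

```xml
<changeSet id="fk-example" author="example">
    <!-- Hypothetical tables: orders.customer_id references customers.id. -->
    <addForeignKeyConstraint
        constraintName="fk_orders_customer"
        baseTableName="orders"
        baseColumnNames="customer_id"
        referencedTableName="customers"
        referencedColumnNames="id"/>
</changeSet>
```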
AddLookupTableChangeDatabricks.java
@@ -6,37 +6,27 @@

import liquibase.change.core.*;
import liquibase.ext.databricks.database.DatabricksDatabase;
import liquibase.ext.databricks.change.createTable.CreateTableStatementDatabricks;
import liquibase.ext.databricks.change.createTable.CreateTableChangeDatabricks;
import liquibase.Scope;
import liquibase.change.*;
import liquibase.database.Database;
import liquibase.database.core.DB2Database;
import liquibase.database.core.Db2zDatabase;
import liquibase.database.core.HsqlDatabase;
import liquibase.database.core.InformixDatabase;
import liquibase.database.core.MSSQLDatabase;
import liquibase.database.core.OracleDatabase;
import liquibase.database.core.SybaseASADatabase;
import liquibase.datatype.DataTypeFactory;
import liquibase.exception.ValidationErrors;
import liquibase.servicelocator.PrioritizedService;
import liquibase.snapshot.SnapshotGeneratorFactory;
import liquibase.statement.NotNullConstraint;
import liquibase.statement.SqlStatement;
import liquibase.statement.core.CreateTableStatement;
import liquibase.statement.core.RawSqlStatement;
import liquibase.statement.core.ReorganizeTableStatement;
import liquibase.structure.core.Column;
import liquibase.structure.core.ForeignKey;
import liquibase.structure.core.Table;
import liquibase.change.core.AddLookupTableChange;
import static liquibase.statement.SqlStatement.EMPTY_SQL_STATEMENT;

/**
* Extracts data from an existing column to create a lookup table.
* A foreign key is created between the old column and the new lookup table.
*/
@DatabaseChange(name = "addLookupTable", priority = DatabricksDatabase.PRIORITY_DATABASE + 500, appliesTo = "column",
// TODO: this class needs refactoring, as it copies the parent class instead of properly inheriting from it.
@DatabaseChange(name = "addLookupTable", priority = PrioritizedService.PRIORITY_DATABASE, appliesTo = "column",
description = "Creates a lookup table containing values stored in a column and creates a foreign key to the new table.")
public class AddLookupTableChangeDatabricks extends AddLookupTableChange {

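Likewise, addLookupTable keeps the standard Liquibase changelog shape: it copies the distinct values of an existing column into a new table and points a foreign key at it. A minimal sketch with hypothetical names:

```xml
<changeSet id="lookup-table-example" author="example">
    <!-- Hypothetical names: extracts distinct address.state values into a new
         state lookup table and adds a foreign key from address.state to it. -->
    <addLookupTable
        existingTableName="address"
        existingColumnName="state"
        newTableName="state"
        newColumnName="abbreviation"
        constraintName="fk_address_state"/>
</changeSet>
```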
