This repository has been archived by the owner on Sep 4, 2024. It is now read-only.

Support for pyspark job submit for Databricks Jobs using astro provider #54

Open
neerajvash8 opened this issue Jul 13, 2023 · 0 comments

Comments


Currently, the DatabricksWorkflowTaskGroup only supports creating notebook tasks via the DatabricksNotebookOperator. While going through Orchestrate Databricks jobs with Apache Airflow, I came across the DatabricksSubmitRunOperator. Supporting it would be really nice functionality: it would let users submit PySpark jobs, take full advantage of the DatabricksWorkflowTaskGroup from astro, and ease the development of clean Airflow DAGs.
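For context, a DatabricksSubmitRunOperator task ultimately posts a `runs/submit` payload to the Databricks Jobs API, and a `spark_python_task` entry is what distinguishes a PySpark-file job from a notebook job. A minimal sketch of such a payload (plain Python, no Airflow required; the run name, file path, and cluster settings below are hypothetical examples):

```python
def build_spark_python_payload(python_file: str, parameters: list) -> dict:
    """Build a Databricks Jobs runs/submit payload containing a spark_python_task.

    All concrete values below are illustrative, not recommendations.
    """
    return {
        "run_name": "example-pyspark-run",  # hypothetical run name
        "new_cluster": {
            "spark_version": "13.3.x-scala2.12",  # example runtime version
            "node_type_id": "i3.xlarge",          # example node type
            "num_workers": 2,
        },
        "spark_python_task": {
            "python_file": python_file,  # e.g. a dbfs:/ or workspace path
            "parameters": parameters,
        },
    }

payload = build_spark_python_payload(
    "dbfs:/scripts/etl_job.py", ["--date", "2023-07-13"]
)
print(payload["spark_python_task"]["python_file"])
```

A payload like this could, in principle, be handed to the operator's `json` argument; the ask here is for the task group to accept such tasks natively alongside notebook tasks.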

There are other ways to implement the above, for example using Databricks Connect V2.

The ask relates to the discussion in How to Orchestrate Databricks Jobs Using Airflow, where Daniel Imberman (@dimberman) mentioned that this functionality is on the roadmap for this project.
