This repository has been archived by the owner on Sep 4, 2024. It is now read-only.

Marking a running DAG/task as failed has no effect on the ongoing Databricks job run #1

Open
pankajkoti opened this issue Mar 10, 2023 · 1 comment

@pankajkoti (Collaborator)

In the Airflow UI, when a Databricks workflow job is running and its DAG or task is marked failed, the state is updated within Airflow, but the ongoing Databricks job run is not cancelled and continues processing.
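One possible direction, not confirmed as this project's fix: Airflow invokes an operator's `on_kill()` hook when a running task instance is externally terminated (which is what happens when a task is marked failed in the UI), so the operator can cancel the remote run there. Below is a minimal sketch, assuming the operator stores the submitted `run_id` and using `DatabricksHook.submit_run`/`cancel_run` from the `apache-airflow-providers-databricks` package; the operator class here is hypothetical:

```python
from typing import Any, Optional

from airflow.models import BaseOperator
from airflow.providers.databricks.hooks.databricks import DatabricksHook


class ExampleDatabricksSubmitOperator(BaseOperator):
    """Hypothetical sketch: cancel the remote Databricks run on external kill."""

    def __init__(
        self,
        *,
        json: dict,
        databricks_conn_id: str = "databricks_default",
        **kwargs: Any,
    ) -> None:
        super().__init__(**kwargs)
        self.json = json  # payload for the Jobs runs/submit endpoint
        self.databricks_conn_id = databricks_conn_id
        self.run_id: Optional[int] = None

    def _get_hook(self) -> DatabricksHook:
        return DatabricksHook(databricks_conn_id=self.databricks_conn_id)

    def execute(self, context: Any) -> None:
        # Submit the run and remember its id so on_kill() can cancel it
        # later. Polling for completion is elided in this sketch.
        self.run_id = self._get_hook().submit_run(self.json)

    def on_kill(self) -> None:
        # Called by Airflow when the running task is externally terminated,
        # e.g. marked failed in the UI. Cancelling the remote run here keeps
        # Databricks in sync with the Airflow task state.
        if self.run_id is not None:
            self._get_hook().cancel_run(self.run_id)
            self.log.info("Cancelled Databricks run %s", self.run_id)
```

Whether `on_kill()` actually fires also depends on the executor delivering the termination signal to the worker process, so a sketch like this covers the common case rather than every failure path.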

@pankajkoti pankajkoti added this to the 0.2.0 milestone Mar 10, 2023
@SenthilMalli

I am also having this issue. Is there any resolution for it?
