Upgrade databricks provider dependency #43272
Open

dcmshi wants to merge 5 commits into apache:main from dcmshi:dshi/upgrade_databricks_prodiver_dependency_version_patch_mypy
+48 −46
Commits
445a275 update databricks-sql-connector to 3.0.0 for databricks provider (rubanolha)
1e73f7f rename Row definition for no-redef error (dcmshi)
37573e6 fix: patches mypy errors for typing (dcmshi)
6da7583 fix: fixes static checks and moves Row import (dcmshi)
4886f4a fix: patches Row typing errors when databricks sql imported (dcmshi)
How are changes to snowflake tests related to this PR?
The Snowflake test module is using the Row class from databricks.sql.types [0], which triggered the mypy checks on that module when upgrading the databricks-sql-connector package. I patched the errors mypy was flagging, since refactoring the module to not use the Row class is probably its own change, but I'm open to suggestions if this isn't the way we'd like to go.

[0] https://github.com/apache/airflow/pull/43272/files/175059dfe5cfb1f64297a8e26bcfe278edd7ebfb#diff-6fa19a213d772197807bb12413c4731e6b8a001a241f7f9009e152a1aac1f066R33
Wait. What??
I am not near my laptop so I can't investigate this right now, but this sounds very odd. We need to look into the commit that added it to see what the reason was for creating such odd coupling.
I think this is the original PR which introduced the import [0], and a later PR added logic to mock the import [1]. It looks to be testing the return value of a SQLExecuteQueryOperator, but test_exec_success doesn't seem to be a Snowflake-specific test.

[0] #28006
[1] #38074
I don't think this is right.
Snowflake tests should not use the databricks SDK.
Yeah. I think this is a mistake: someone used the databricks Row in the Snowflake tests. They "match" because both represent dbapi output with a generally similar structure, but this should, I think, be fixed by either using some generic Row structure or creating the tests' own Row class.
Yes. That's the way to go. Ideally we would also add tests similar to the ones we have in Snowflake now to the databricks tests, using the databricks Row, while changing the Snowflake tests to not use the databricks Row.
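A minimal sketch of what a test-local Row could look like (this is an assumption, not the actual Airflow test code): a tuple subclass that mimics the keyword-argument construction and attribute access of databricks.sql.types.Row, so the Snowflake tests would not need the databricks import at all.

```python
# Hedged sketch of a test-local Row substitute: a tuple with attribute
# access by column name, dbapi-style. All names here are illustrative.
from typing import Any


class Row(tuple):
    """Tuple whose values can also be read as attributes by column name."""

    def __new__(cls, **fields: Any) -> "Row":
        obj = super().__new__(cls, fields.values())
        obj._fields = tuple(fields)  # column names, in declaration order
        return obj

    def __getattr__(self, name: str) -> Any:
        # Called only when normal attribute lookup fails; map the
        # requested name onto the corresponding tuple position.
        try:
            return self[self._fields.index(name)]
        except ValueError:
            raise AttributeError(name) from None
```

With this in the test module, a fixture like `Row(id=1, name="abc")` behaves as both a tuple and a record, which is the shape the mocked SQLExecuteQueryOperator results appear to rely on.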