Add Oracle to airflow->cosmos profile map #1190
base: main
Conversation
Codecov Report

```
@@            Coverage Diff             @@
##             main    #1190      +/-   ##
==========================================
- Coverage   95.72%   95.43%   -0.30%
==========================================
  Files          64       67       +3
  Lines        3672     3788     +116
==========================================
+ Hits         3515     3615     +100
- Misses        157      173      +16
```
I believe it's worth adding the missing tests to avoid regression. Otherwise, it looks great!
Working on that. Should have something in the next few days.
Hi @slords, thanks a lot for working on this feature!
…#1221) This adds a privacy notice and website analytics to the Cosmos readme and auto-generated docs. Note that while you cannot explicitly opt out of website analytics for the publicly hosted readme (and docs), Scarf respects the browser's Do Not Track (DNT) setting. If DNT is set in the browser, telemetry for that user will not be sent to Scarf.

Scarf privacy policy: https://about.scarf.sh/privacy-policy
Astronomer privacy policy: https://www.astronomer.io/privacy/
Added a new `GCP_CLOUD_RUN_JOB` execution mode that triggers a Google Cloud Platform Cloud Run Job instance with the dbt model in it. It extends Airflow's `CloudRunExecuteJobOperator` and overrides the Cloud Run Job's container with the dbt command generated by Cosmos.

Note: `CloudRunExecuteJobOperator` gained the `container_overrides` parameter in `apache-airflow-providers-google==10.13.0`, which requires `airflow >= 2.6.0`.

Resolves astronomer#1149

Co-authored-by: Agata Zalewska <[email protected]>
Co-authored-by: Tatiana Al-Chueyr <[email protected]>
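The container-override idea in this change can be sketched as follows. This is a hedged illustration of the Cloud Run "container overrides" payload shape only, not Cosmos's actual implementation: the helper name `build_dbt_container_override` is hypothetical, and the exact parameter shape accepted by `CloudRunExecuteJobOperator` should be checked against the provider's documentation.

```python
# Hypothetical sketch: wrapping a Cosmos-generated dbt command in a
# Cloud Run Job container-overrides payload. Names here are illustrative,
# not Cosmos's or the Google provider's actual API.

def build_dbt_container_override(dbt_cmd: list) -> dict:
    """Return an overrides payload that replaces the job container's args."""
    return {"container_overrides": [{"args": dbt_cmd}]}

overrides = build_dbt_container_override(["dbt", "run", "--select", "my_model"])
print(overrides)
```

The point of the execution mode is that each task builds one such payload, so every dbt node runs as its own short-lived Cloud Run Job execution.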
…1198) In projects containing models with names like the following, dataset creation fails, and an error occurs during execution.

```txt
└── dbt
    └── my_project
        └── models
            ├── 日本語名モデル.sql
            └── 日本語名モデル.yml
```

```
File "/home/airflow/.local/lib/python3.12/site-packages/airflow/datasets/__init__.py", line 78, in _sanitize_uri
    raise ValueError("Dataset URI must only consist of ASCII characters")
ValueError: Dataset URI must only consist of ASCII characters
```

To support model names with multibyte characters, it might be good to URL encode the names.

closes: astronomer#1197

Co-authored-by: Tatiana Al-Chueyr <[email protected]>
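The URL-encoding idea can be sketched with the standard library: `urllib.parse.quote` percent-encodes the UTF-8 bytes of a name, yielding a pure-ASCII string that satisfies Airflow's Dataset URI check. This illustrates the approach, not the PR's exact code.

```python
from urllib.parse import quote, unquote

# Percent-encode a multibyte model name so the resulting Dataset URI
# contains only ASCII characters (illustration of the approach).
name = "日本語名モデル"
encoded = quote(name)

print(encoded.isascii())          # True: safe for ASCII-only Dataset URIs
print(unquote(encoded) == name)   # True: the encoding is reversible
```

Since `quote`/`unquote` round-trip losslessly, the original model name can still be recovered from the sanitized URI when needed.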
Hey @slords, we're really close to completing this feature - it would be great to release it as part of Cosmos 1.8. Please, do let us know if you'd like any additional support.
Description
Add the ability to map Oracle connections from Airflow to Cosmos.
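As an illustration of what such a mapping does, the sketch below translates Airflow Oracle connection fields into a dbt `oracle` profile target. This is a hypothetical, simplified version: the real change subclasses Cosmos's profile-mapping base class, and the exact profile keys should be checked against the dbt-oracle documentation.

```python
# Hypothetical, simplified sketch of an Airflow -> dbt-oracle profile mapping.
# Field and key names are illustrative; Cosmos's actual mapping class and the
# dbt-oracle profile schema are the source of truth.

def oracle_conn_to_profile(conn: dict) -> dict:
    """Build a dbt 'oracle' profile target from Airflow connection fields."""
    return {
        "type": "oracle",
        "user": conn["login"],
        "password": conn["password"],
        "host": conn["host"],
        # 1521 is Oracle's conventional default listener port
        "port": conn.get("port") or 1521,
        "service": (conn.get("extra") or {}).get("service_name", ""),
    }

profile = oracle_conn_to_profile(
    {"login": "scott", "password": "tiger", "host": "db.example.com",
     "extra": {"service_name": "ORCLPDB1"}}
)
print(profile["port"], profile["service"])
```

The value of such a mapping is that credentials stay in the Airflow connection store, and the dbt profile is generated at runtime rather than committed to the project.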
Related Issue(s)
closes #1189
Breaking Change?
None that I'm aware of.
Checklist