[Doc] Document `budget_policy_id` in `databricks_pipeline` and `databricks_job` (#4110)

## Changes


`databricks_pipeline` already has this change; for `databricks_job`, we need to merge Go SDK 0.49.0 first.

## Tests

- [ ] `make test` run locally
- [x] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] relevant acceptance tests are passing
- [ ] using Go SDK
alexott authored Oct 16, 2024
1 parent 2bbf251 commit c3ae6f3
Showing 2 changed files with 2 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/resources/job.md
@@ -107,6 +107,7 @@ The resource supports the following arguments:
* `notification_settings` - (Optional) An optional block controlling the notification settings on the job level [documented below](#notification_settings-configuration-block).
* `health` - (Optional) An optional block that specifies the health conditions for the job [documented below](#health-configuration-block).
* `tags` - (Optional) An optional map of the tags associated with the job. See [tags Configuration Map](#tags-configuration-map)
* `budget_policy_id` - (Optional) The ID of the user-specified budget policy to use for this job. If not specified, a default budget policy may be applied when creating or modifying the job.

### task Configuration Block

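The new job-level argument can be set directly on the resource. A minimal sketch of a job with a budget policy attached (the policy ID, notebook path, and cluster settings below are placeholder assumptions, not values from this commit):

```hcl
resource "databricks_job" "this" {
  name             = "example-job"
  budget_policy_id = "0123-456789-abcdef" # hypothetical budget policy ID

  task {
    task_key = "example"

    notebook_task {
      notebook_path = "/Shared/example" # hypothetical notebook path
    }

    new_cluster {
      num_workers   = 1
      spark_version = "15.4.x-scala2.12" # assumed available runtime
      node_type_id  = "i3.xlarge"        # assumed node type
    }
  }
}
```

If `budget_policy_id` is omitted, a default budget policy may be applied when the job is created or modified, as the argument description above notes.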
1 change: 1 addition & 0 deletions docs/resources/pipeline.md
@@ -83,6 +83,7 @@ The following arguments are supported:
* `target` - The name of a database (in either the Hive metastore or in a UC catalog) for persisting pipeline output data. Configuring the target setting allows you to view and query the pipeline output data from the Databricks UI.
* `edition` - optional name of the [product edition](https://docs.databricks.com/data-engineering/delta-live-tables/delta-live-tables-concepts.html#editions). Supported values are: `CORE`, `PRO`, `ADVANCED` (default). Not required when `serverless` is set to `true`.
* `channel` - optional name of the release channel for Spark version used by DLT pipeline. Supported values are: `CURRENT` (default) and `PREVIEW`.
* `budget_policy_id` - optional string specifying the ID of the budget policy for this DLT pipeline.
* `allow_duplicate_names` - Optional boolean flag. If false, deployment will fail if the name conflicts with that of another pipeline. Default is `false`.
* `deployment` - Deployment type of this pipeline. Supports the following attributes:
* `kind` - The deployment method that manages the pipeline.
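For the pipeline resource, `budget_policy_id` sits alongside the other top-level arguments. A minimal sketch (the policy ID, catalog, target schema, and notebook path are placeholder assumptions):

```hcl
resource "databricks_pipeline" "this" {
  name             = "example-pipeline"
  budget_policy_id = "0123-456789-abcdef" # hypothetical budget policy ID
  serverless       = true
  catalog          = "main"           # assumed UC catalog
  target           = "example_schema" # assumed target schema

  library {
    notebook {
      path = "/Shared/dlt-example" # hypothetical notebook path
    }
  }
}
```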
