AMP-122662 [falcon] update the cluster permission required for databricks import #543

Merged 1 commit on Mar 5, 2025
4 changes: 2 additions & 2 deletions content/collections/source-catalog/en/databricks.md

@@ -136,7 +136,7 @@ The service principal you created above requires the following permissions in Databricks:
 | ---------- | ------------------------------------------------------------------------------------ | -------------------------------------------------------------- |
 | Workspace | Grants access to your Databricks workspace. | *Workspace → <workspace_name> → Permissions → Add permissions* <br/> Add the service principal you create with the User permission, click Save. |
 | Table | Grants access to list tables and read data. | *Catalog → pick the catalog → Permissions → Grant* <br/> Select the `Data Reader` permission (`USE CATALOG`, `USE SCHEMA`, `EXECUTE`, `READ VOLUME`, `SELECT`). |
-| Cluster | Grants access to connect to the cluster and run workflows on your behalf | *Compute → All-purpose compute → Edit Permission* <br/> Add the `Can Attach To` permission to the service principal. |
+| Cluster | Grants access to connect to the cluster and run workflows on your behalf | *Compute → All-purpose compute → Edit Permission* <br/> Add the `Can Restart` permission to the service principal. |
 | Export | Enables the service principal to unload your data through Spark and export it to S3. | Run the SQL commands below in any notebook: ```GRANT MODIFY ON ANY FILE TO `<service_principal_uuid>`;``` ```GRANT SELECT ON ANY FILE TO `<service_principal_uuid>`;``` |

### Enable CDF on your tables
@@ -216,4 +216,4 @@ Depending on your company's network policy, you may need to add the following IP
 - Amplitude EU IP addresses:
 - 3.124.22.25
 - 18.157.59.125
-- 18.192.47.195
+- 18.192.47.195
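
As context for the one-line change above: the docs now tell users to grant `Can Restart` instead of `Can Attach To` on the cluster. The same grant can also be applied programmatically. Below is a minimal sketch assuming the Databricks Permissions API (`PATCH /api/2.0/permissions/clusters/{cluster_id}`); the host, token, cluster ID, and service principal ID are all placeholders, and the exact payload shape should be verified against the Databricks API reference for your workspace.

```python
import json
import urllib.request


def build_can_restart_grant(service_principal_id: str) -> dict:
    """Build a PATCH body that adds CAN_RESTART for a service principal.

    Field names follow the Databricks Permissions API; treat the exact
    shape as an assumption and check it against your workspace's docs.
    """
    return {
        "access_control_list": [
            {
                "service_principal_name": service_principal_id,
                "permission_level": "CAN_RESTART",
            }
        ]
    }


def grant_can_restart(host: str, token: str, cluster_id: str,
                      service_principal_id: str) -> None:
    """Send the grant request; host/token/cluster_id are placeholders."""
    body = json.dumps(build_can_restart_grant(service_principal_id)).encode()
    req = urllib.request.Request(
        f"{host}/api/2.0/permissions/clusters/{cluster_id}",
        data=body,
        method="PATCH",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()
```

A `PATCH` (rather than `PUT`) is used so the new entry is merged into the existing access control list instead of replacing it, which matches how the UI's "Edit Permission" flow behaves.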