This repository has been archived by the owner on Jun 25, 2024. It is now read-only.
Thanks for an awesome tool! I did spend way too long, though, trying to get the tool to connect to the Databricks instance via DevOps. Now that it's working, I thought I might share it in the docs.
For the use case below, I wanted to deploy any code from a folder (/azure/databricks) on the main branch to a folder (/repo) on my Databricks instance, and refresh that folder on any change to main. It uses the MANAGED connection type, so Databricks will automatically provision the service user - vital for real CI across multiple environments.
The below assumes you have an Azure Resource Manager connection set up in DevOps that points to the Azure resource group, and that you have granted the ARM DevOps service principal 'Contributor'-level permissions in Databricks.
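If the service connection's principal doesn't yet have that role, a minimal sketch of the grant using the az CLI (all names are placeholders matching the example; requires an authenticated az session with rights to assign roles):

```shell
# Placeholder values - substitute your own.
APP_ID="<app-id-of-the-devops-service-connection>"
SUBSCRIPTION_ID="$(az account show --query id --output tsv)"

# Resolve the service principal's object id from its application id
SP_OBJECT_ID="$(az ad sp show --id "$APP_ID" --query id --output tsv)"

# Grant Contributor on the resource group that contains the workspace
az role assignment create \
  --assignee-object-id "$SP_OBJECT_ID" \
  --assignee-principal-type ServicePrincipal \
  --role Contributor \
  --scope "/subscriptions/$SUBSCRIPTION_ID/resourceGroups/my-azure-rg-name"
```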
Where should I stick this example in the repo?
```yaml
trigger:
- main

jobs:
- job: Deploy_notebooks_to_dbworkspace
  pool:
    vmImage: 'windows-latest' # Haven't tried on Linux
  steps:
  - task: AzureCLI@2
    inputs:
      azureSubscription: 'my-devops-connection-name' # Name of DevOps service connection to Azure
      scriptType: ps
      scriptLocation: inlineScript
      inlineScript: |
        Install-Module -Name azure.databricks.cicd.tools -Scope CurrentUser -Force
        Import-Module -Name azure.databricks.cicd.tools
        Connect-Databricks -Region "australiaeast" -ApplicationId "$env:servicePrincipalId" -Secret "$env:servicePrincipalKey" -ResourceGroupName "my-azure-rg-name" -SubscriptionId "$(az account show --query id --output tsv)" -TenantId "$env:tenantId" -WorkspaceName "my-Correctly-Capitalised-DB-workspace-name"
        Import-DatabricksFolder -Region australiaeast -LocalPath '$(Build.SourcesDirectory)/azure/databricks' -DatabricksPath '/repo' -Clean
      addSpnToEnvironment: true # Injects $env:servicePrincipalId, $env:servicePrincipalKey, $env:tenantId
      powerShellErrorActionPreference: 'stop' # Optional
      failOnStandardError: false # Optional
      azurePowerShellVersion: 'latestVersion' # Required. Options: latestVersion, otherVersion
      pwsh: false # Optional. If true, then will use PowerShell Core pwsh.exe
```
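Before wiring this into the pipeline, it can help to smoke-test the same two module calls locally. A sketch, assuming the placeholder workspace/resource-group names above and a service principal whose app id, secret, and tenant id you supply yourself (requires live Azure credentials, so it can't run unattended):

```powershell
# Local smoke test - placeholder values, substitute your own.
Install-Module -Name azure.databricks.cicd.tools -Scope CurrentUser -Force
Import-Module -Name azure.databricks.cicd.tools

Connect-Databricks -Region "australiaeast" `
    -ApplicationId "<sp-app-id>" `
    -Secret "<sp-secret>" `
    -ResourceGroupName "my-azure-rg-name" `
    -SubscriptionId "<subscription-id>" `
    -TenantId "<tenant-id>" `
    -WorkspaceName "my-Correctly-Capitalised-DB-workspace-name"

# Push the local folder and confirm it lands under /repo in the workspace
Import-DatabricksFolder -Region australiaeast `
    -LocalPath "./azure/databricks" `
    -DatabricksPath "/repo" -Clean
```

If this succeeds locally, any remaining pipeline failures are almost certainly down to the service connection or its permissions rather than the module itself.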