Merge pull request #3 from newrelic-experimental/fix/docs
chore(docs): config file refs and APM config
jbeveland27 authored May 22, 2024 · 2 parents 119342f + 3c2d4c4 · commit 31d6b87
Showing 2 changed files with 25 additions and 6 deletions.
README.md (8 changes: 4 additions & 4 deletions)
````diff
@@ -80,12 +80,12 @@ Based on the cloud Databricks is hosted on, you will be able to run the APM agen
 
 # Create new relic yml file
 echo "common: &default_settings
-license_key: 'xxxxxx' # Replace with your License Key
-agent_enabled: true
+  license_key: 'xxxxxx' # Replace with your License Key
+  agent_enabled: true
 production:
-<<: *default_settings
-app_name: Databricks" > ${NR_CONFIG_FILE}
+  <<: *default_settings
+  app_name: Databricks" > ${NR_CONFIG_FILE}
 ```
 
 2. **Add the script to your Databricks cluster:** To add the initialization script to your cluster in Databricks, follow these steps:
````
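The removed and added README lines differ only in leading whitespace, which this view does not preserve; the render below assumes the conventional two-space indentation for a `newrelic.yml`, so the anchor and merge keys parse as intended:

```yaml
# newrelic.yml as written by the init script above; values are placeholders
common: &default_settings
  license_key: 'xxxxxx' # Replace with your License Key
  agent_enabled: true
production:
  <<: *default_settings # merge default_settings into the production environment
  app_name: Databricks
```

Without that indentation, `license_key` and `agent_enabled` would land at the document root rather than under `common`, and the `*default_settings` merge under `production` would pull in an empty mapping instead of the license key.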
configs/example_config.yaml (23 changes: 21 additions & 2 deletions)
```diff
@@ -18,10 +18,29 @@ nr_api_key: xxxxx
 nr_endpoint: US
 
 # Databricks Credentials
+# https://docs.databricks.com/en/dev-tools/auth/pat.html#databricks-personal-access-tokens-for-workspace-users
 db_access_token: xxxxx
 
+# Spark Driver Proxy API Endpoint
+# Ref: https://databrickslabs.github.io/overwatch/assets/_index/realtime_helpers.html
+# Run a Scala notebook to get this, use the resultant `url`
+# val env = dbutils.notebook.getContext.apiUrl.get
+# val token = dbutils.notebook.getContext.apiToken.get
+# val clusterId = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
+# val sparkContextID = spark.conf.get("spark.databricks.sparkContextId")
+# val orgId = spark.conf.get("spark.databricks.clusterUsageTags.clusterOwnerOrgId")
+# val uiPort = spark.conf.get("spark.ui.port")
+# val apiPath = "applications"
+# val url = s"$env/driver-proxy-api/o/$orgId/$clusterId/$uiPort"
 spark_endpoint: xxxxx
-databricks_endpoint: xxxxx
+
+# Your databricks workspace URL
+# https://docs.databricks.com/en/dev-tools/auth/index.html#databricks-account-and-workspace-rest-apis
+databricks_endpoint: xxxxx #
 
+# Azure | AWS | GCP
+db_cloud: xxx
+
+# Exporter batching and timing
 batch_size: 50
-harvest_time: 10
+harvest_time: 10
```
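The commented Scala snippet in the new config only assembles the proxy URL; a quick way to sanity-check the resulting `spark_endpoint` is to call it from the same notebook. This is a minimal sketch assuming the driver proxy fronts the standard Spark monitoring REST API under `/api/v1`, which is where the `apiPath = "applications"` value would be appended; that path convention is not stated in this commit.

```scala
// Connectivity check for the driver-proxy URL, run in a Databricks Scala notebook.
// Assumes the proxy exposes Spark's monitoring REST API under /api/v1.
import java.net.{HttpURLConnection, URL}
import scala.io.Source

val env       = dbutils.notebook.getContext.apiUrl.get
val token     = dbutils.notebook.getContext.apiToken.get
val clusterId = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
val orgId     = spark.conf.get("spark.databricks.clusterUsageTags.clusterOwnerOrgId")
val uiPort    = spark.conf.get("spark.ui.port")
val url       = s"$env/driver-proxy-api/o/$orgId/$clusterId/$uiPort"

// GET <url>/api/v1/applications with the notebook's API token.
val conn = new URL(s"$url/api/v1/applications")
  .openConnection.asInstanceOf[HttpURLConnection]
conn.setRequestProperty("Authorization", s"Bearer $token")
println(s"HTTP ${conn.getResponseCode} from $url")
println(Source.fromInputStream(conn.getInputStream).mkString.take(500))
```

A 200 response with a JSON array of applications means the `spark_endpoint` value is usable; a 403 usually points at the token, and a connection failure at one of the URL components.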
