Azure Databricks and Azure Key Vault

The key vault should always be a core component of your Azure design because it lets you store keys, secrets, and certificates, abstracting away the true connection strings and keys from your code. When working with Databricks to mount storage so you can ingest and query your data, ideally you should be leveraging this by creating secrets and secret scopes.
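As a quick illustration, once a Key Vault-backed secret scope exists you can inspect it from a notebook. A minimal sketch (the scope and key names are the ones used later in this post, and secret values themselves are redacted if you try to display them):

# List the secret scopes attached to the workspace and the secrets in one scope.
print(dbutils.secrets.listScopes())
print(dbutils.secrets.list("akssecretkey"))

# Reading a secret returns its value for use in code, but it is redacted in notebook output.
storage_key = dbutils.secrets.get(scope = "akssecretkey", key = "lakeclusterstore")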

So, your code to mount would be something like the following. Notice the scope and key.

dbutils.fs.mount(
  source = "wasbs://yourdrive@waterfakelake.blob.core.windows.net",
  mount_point = "/mnt/sensedata",
  # The storage account in this config key must match the account in the source URL.
  extra_configs = {"fs.azure.account.key.waterfakelake.blob.core.windows.net": dbutils.secrets.get(scope = "akssecretkey", key = "lakeclusterstore")})
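One caveat: the mount call fails if the mount point already exists, so a small guard you can run first (a sketch, using the same /mnt/sensedata path as above):

# Unmount first if the mount point already exists, otherwise dbutils.fs.mount raises an error.
if any(m.mountPoint == "/mnt/sensedata" for m in dbutils.fs.mounts()):
    dbutils.fs.unmount("/mnt/sensedata")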

You need to do setup work in both Databricks and the key vault. I will try to lay this out simply. Before you begin, grab a key for the storage account that holds the data you need to mount to Databricks. Save it, along with the account name.
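If you would rather fetch the key programmatically than copy it from the portal, here is a minimal sketch using the azure-mgmt-storage SDK; the subscription ID and resource group are placeholders you would swap for your own:

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

# Placeholders: your subscription ID and the resource group holding the storage account.
storage_client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")
keys = storage_client.storage_accounts.list_keys("<resource-group>", "waterfakelake")
print(keys.keys[0].value)  # this is the key you will store in the key vault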

Go to the key vault and add a secret.

Use the table below to complete the secret details.
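For reference, the same secret can also be created from code with the azure-keyvault-secrets SDK. A minimal sketch, assuming a placeholder vault name and the secret name used in the mount code above:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Placeholder vault URL; the secret name matches the key used in dbutils.secrets.get.
secret_client = SecretClient(vault_url="https://<your-vault>.vault.azure.net/", credential=DefaultAzureCredential())
secret_client.set_secret("lakeclusterstore", "<storage-account-key>")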

Once the secret is created, find the secret ID, the key vault's DNS name, and its resource ID. Next you need to browse to a specific URL based on your own Databricks environment (a fake URL is shown below, but note the #secrets/createScope appended to it).

https://adb-mistermister.azuredatabricks.net/?o=999999998#secrets/createScope
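If you prefer to pull the DNS name and resource ID from code instead of the portal, here is a minimal sketch with the azure-mgmt-keyvault SDK (subscription ID, resource group, and vault name are placeholders):

from azure.identity import DefaultAzureCredential
from azure.mgmt.keyvault import KeyVaultManagementClient

kv_client = KeyVaultManagementClient(DefaultAzureCredential(), "<subscription-id>")
vault = kv_client.vaults.get("<resource-group>", "<your-vault>")
print(vault.properties.vault_uri)  # the DNS name the createScope screen asks for
print(vault.id)                    # the resource ID the createScope screen asks for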

This takes you to the Create Secret Scope screen, which you complete using the information you have gathered so far: the scope name, plus the key vault's DNS name and resource ID.

Once complete, the original code at the start of this blog post will work, and the output will confirm a successful execution.
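To sanity-check the mount, you can list the mount point and read a file from it. A small sketch, where the CSV file name is hypothetical:

# Confirm the container contents are visible through the mount point.
display(dbutils.fs.ls("/mnt/sensedata"))

# Hypothetical example file: read a CSV from the mounted storage into a DataFrame.
df = spark.read.csv("/mnt/sensedata/sample.csv", header=True)
df.show(5)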
