Intro
With Cloud Amplifier, you can power your Domo instance using Databricks. This guide is written for users who are familiar with Databricks and describes how to register Databricks with Cloud Amplifier for read-only and read/write integrations. After completing the setup processes of your choice, you can use virtual tables that read from Databricks to create cards, configure alerts, and serve as inputs in Magic ETL DataFlows.

Architectural Overview

Prerequisites
Before configuring the Databricks connection, we strongly recommend that you complete the following:
- (Recommended) Create a Databricks service account. We recommend creating a new Databricks account specifically for this integration. You can use any account with Read access in Databricks, but it is best practice to use a service account. This account must have read access to your default Databricks environment in order to create virtual Databricks tables in Domo. (A sketch of the corresponding grants follows this list.)
- (Recommended) Create a Domo service account. We recommend creating a new Domo account specifically for this integration. The custom role for the account must have the Manage Cloud Accounts and Manage DataSet grants enabled. For more information about authenticating your Databricks-Cloud Amplifier connection, see Databricks using Personal Access Token.
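If your workspace uses Unity Catalog, granting the Databricks service account read access can be scripted rather than clicked through. The following is a minimal sketch, assuming the databricks-sql-connector package; the hostname, HTTP path, token, principal, and catalog/schema names are all hypothetical placeholders, and on workspaces without Unity Catalog the grant model differs.

```python
# Sketch: grant a Databricks service account read access using the
# databricks-sql-connector package (pip install databricks-sql-connector).
# Every identifier below is a hypothetical placeholder.
from databricks import sql

READ_PRINCIPAL = "domo-cloud-amplifier@example.com"  # hypothetical service account

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # your workspace host
    http_path="/sql/1.0/warehouses/abc123def456",                  # your SQL warehouse
    access_token="dapiXXXXXXXXXXXXXXXX",                           # an admin's token
) as conn:
    with conn.cursor() as cur:
        # Unity Catalog read grants: USE on catalog and schema,
        # SELECT on everything in the schema.
        cur.execute(f"GRANT USE CATALOG ON CATALOG main TO `{READ_PRINCIPAL}`")
        cur.execute(f"GRANT USE SCHEMA ON SCHEMA main.default TO `{READ_PRINCIPAL}`")
        cur.execute(f"GRANT SELECT ON SCHEMA main.default TO `{READ_PRINCIPAL}`")
```

Granting SELECT at the schema level covers every table in that schema; scope the grants more narrowly if you only plan to expose specific tables.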
Read-Only Setup
Follow these steps to configure read-only access to Databricks tables from within Domo via Cloud Amplifier.
- Log in to your Domo service account.
- Go to the Data Warehouse.
- Select Add New Cloud Account. The new integration modal displays.
- In the modal under Native integration, select Databricks. (If Databricks doesn't display, select See More to see all possible integrations.)
- Select + Add New Integration. If there are other existing integrations, this option displays at the bottom of the list.
- Fill out the fields in the modal. You can move through the modal by selecting Next on each screen.
  - Enter a name and optional description in the Integration name and Integration description fields.
  - Create a personal access token in your Databricks account. Copy the token for your records; you cannot recover it after you leave the page. (A scripted alternative is sketched after this list.)
  - Paste the token into the Databricks personal access token field in the integration modal.
  - Locate the Databricks connection URL in Databricks. (Navigate to SQL Warehouses > { warehouse name } > Connection details in Databricks.)
    Note: Do not include protocol identifiers with the URL. jdbc:databricks:// is assumed; jdbc:spark:// is not supported. (See the normalization sketch after this list.)
  - Copy the JDBC URL and paste it into the Databricks connection URL field in the integration modal.
- Configure your data freshness settings. Learn about advanced scheduling for data freshness.
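Personal access tokens are normally created in the Databricks user settings UI, but the step can also be scripted with the Databricks Token API. The following is a minimal sketch; the workspace URL and the bootstrap token used to mint the new one are hypothetical placeholders.

```python
# Sketch: mint a personal access token with the Databricks Token API.
# The workspace URL and bootstrap token are hypothetical placeholders.
import requests

WORKSPACE = "https://adb-1234567890123456.7.azuredatabricks.net"
BOOTSTRAP_TOKEN = "dapiXXXXXXXXXXXXXXXX"  # an existing token for the service account

resp = requests.post(
    f"{WORKSPACE}/api/2.0/token/create",
    headers={"Authorization": f"Bearer {BOOTSTRAP_TOKEN}"},
    json={
        "comment": "Domo Cloud Amplifier integration",
        "lifetime_seconds": 7_776_000,  # 90 days; adjust to your rotation policy
    },
    timeout=30,
)
resp.raise_for_status()
new_token = resp.json()["token_value"]  # shown once; store it securely
print("Paste into the Databricks personal access token field:", new_token)
```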
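The protocol-identifier note is easy to miss. As an illustration, here is a small sketch of the normalization it implies: strip a jdbc:databricks:// prefix if present and reject jdbc:spark:// URLs. The example URL is hypothetical.

```python
# Sketch: strip the protocol identifier from a copied JDBC URL, per the note above.
def normalize_connection_url(url: str) -> str:
    if url.startswith("jdbc:spark://"):
        raise ValueError("jdbc:spark:// URLs are not supported; copy a jdbc:databricks:// URL")
    prefix = "jdbc:databricks://"
    return url[len(prefix):] if url.startswith(prefix) else url

# Hypothetical URL copied from SQL Warehouses > Connection details:
raw = ("jdbc:databricks://adb-1234567890123456.7.azuredatabricks.net:443/default;"
       "transportMode=http;ssl=1;httpPath=/sql/1.0/warehouses/abc123def456")
print(normalize_connection_url(raw))
```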
Add Databricks Tables to Domo
The following process is optional.
- After setting up your read-only integration, select Choose Tables to Connect.
- Search for and select the Databricks schemas and tables you want to use to create DataSets in Domo. (To preview what the integration token can see, see the sketch after this list.)
- Select Create DataSets.
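Before picking tables in Domo, you may want to confirm what the integration's token can actually see. The following sketch, reusing the hypothetical connection details from earlier, lists the visible schemas and tables.

```python
# Sketch: list the schemas and tables visible to the integration token.
# Connection details are hypothetical placeholders.
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abc123def456",
    access_token="dapiXXXXXXXXXXXXXXXX",  # the integration's personal access token
) as conn:
    with conn.cursor() as cur:
        cur.execute("SHOW SCHEMAS")
        schemas = [row[0] for row in cur.fetchall()]
        for schema in schemas:
            cur.execute(f"SHOW TABLES IN `{schema}`")
            for row in cur.fetchall():
                print(f"{schema}.{row[1]}")  # row[1] is the table name
```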

Write Setup
Before taking the steps described in this section, complete the following prerequisites.

Prerequisites
- Enable Unity Catalog on your Databricks instance. Find enablement instructions here.
- Obtain the following Databricks permissions for your Databricks user (a sketch of the corresponding grants follows this list):
  - Create storage credential
  - Create catalog
  - Create external location
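A metastore admin can grant all three privileges with Unity Catalog SQL. The following is a minimal sketch, reusing the hypothetical connection details and service principal from earlier.

```python
# Sketch: grant the three metastore privileges the write setup requires.
# Run as a metastore admin; identifiers are hypothetical placeholders.
from databricks import sql

WRITE_PRINCIPAL = "domo-cloud-amplifier@example.com"

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abc123def456",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as conn:
    with conn.cursor() as cur:
        for privilege in ("CREATE STORAGE CREDENTIAL",
                          "CREATE CATALOG",
                          "CREATE EXTERNAL LOCATION"):
            cur.execute(f"GRANT {privilege} ON METASTORE TO `{WRITE_PRINCIPAL}`")
```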
Convert a Read-Only Integration to Read-Write
- In your Domo instance, go to the Data Warehouse and select Manage Cloud Connections to open the integration modal.
- In the modal under Native integration, select See more > Databricks. The list of existing Databricks integrations displays.
- Locate the existing read-only integration that you want to upgrade to a read-write integration. Hover over the integration row and select Options (wrench icon) > Configure Write Access. Use Next to move through the modal as you complete the steps.
- In the modal, enter the Databricks write catalog name in the labeled field.
- Select Validate to check the name of the catalog and load the list of schemas that belong to that catalog.
- After validating the catalog, choose the schema from the Schema dropdown.
- Follow the on-screen instructions to create a storage credential and enter it in the STORAGE_AWS_EXTERNAL_ID field.
- Follow the on-screen instructions to create External Locations. (A SQL sketch of this step follows the list.)
- On the Finalize Write Integration screen, confirm that you understand that Domo will be able to make changes to your Databricks environment. A success message displays.
- Choose Return To Account List to see your integration configured with read and write access.
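Domo's on-screen instructions are authoritative for the storage credential and External Locations steps, but for reference, here is a hedged sketch of the Unity Catalog SQL that typically backs the external-location step. It assumes a storage credential named domo_write_credential was already created in the previous step; the location name, bucket URL, and connection details are hypothetical placeholders.

```python
# Sketch: register the external location Domo writes to, assuming a storage
# credential named domo_write_credential already exists from the previous step.
# The location name, bucket URL, and connection details are hypothetical.
from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abc123def456",
    access_token="dapiXXXXXXXXXXXXXXXX",
) as conn:
    with conn.cursor() as cur:
        cur.execute(
            "CREATE EXTERNAL LOCATION IF NOT EXISTS domo_write_location "
            "URL 's3://example-bucket/domo-write/' "
            "WITH (STORAGE CREDENTIAL domo_write_credential)"
        )
        cur.execute("DESCRIBE EXTERNAL LOCATION domo_write_location")
        print(cur.fetchall())  # confirm the location is registered
```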
