
Intro

If you have integrated your cloud data warehouse with Domo, you can use Magic ETL to define a data transformation that executes directly on your warehouse data. This adds flexibility to your environment and saves you from having to refactor existing SQL for use in Magic ETL. Whether you have existing SQL code or simply prefer writing SQL to dragging and dropping tiles in the Magic ETL interface, Pass-Through SQL in the Magic ETL SQL tile lets you use native warehouse SQL syntax to transform your data in ways specific to your use cases. Learn more about the SQL tile.

Which cloud data warehouses are supported?
Tip: Before using this feature, review the required grants and the integration configuration requirements described below.


Required Grants

In Domo, you need to have the following grants enabled to use Pass-Through SQL:
  • Edit Adrenaline DataFlow — You must have this grant through your system or custom role. Learn how to add grants to a custom role.
  • One of the following:
    • (Admin-level) Manage Cloud Accounts — This grant is considered admin-level because it gives the holder access to instance-wide controls.
    • Edit DataFlow — This option is available only if Integration Sharing is enabled. If using the Edit DataFlow grant, the holder must also have Can Write permission to the integration account. Learn how to share an integration.
      Important: To use Pass-Through SQL, your cloud data warehouse integration with Domo must be configured for WRITE and TRANSFORM. See the documentation for your supported cloud data warehouse to learn how to configure your integration.

Access Pass-Through SQL

Pass-Through SQL is available when using cloud compute for Magic ETL jobs, such as on Snowflake. When an integration is selected as the Compute Cloud in Magic ETL, the SQL tile allows you to choose that data warehouse as the SQL dialect. Learn more about the SQL tile.

Use Pass-Through SQL in Magic ETL

This example describes how to use Pass-Through SQL with Snowflake. The process is the same with other integrations, such as Databricks.
  1. After opening Magic ETL, choose a compute engine from the Compute dropdown menu. In this example, we are using a Snowflake engine, since we want to execute the transform on a Snowflake instance.
    Important: If you change the compute engine after the DataFlow has executed, you must remove the outputs, re-create them, and select a new compute location. You must also change the inputs if they aren’t available on the new cloud.
  2. Drag a SQL tile (in the left panel under Utility) onto the canvas to open the tile configuration panel.
  3. At the bottom of the configuration panel, select Snowflake from the SQL Dialect menu.
    A pop-up displays notes and cautions: SQL written in this tile executes directly on your Snowflake instance. You can dismiss the pop-up temporarily or permanently.
  4. Use the Code tab of the configuration panel to write your SQL (see the example after the notes below).
    Keep in mind:
    • You can use any SQL SELECT statement syntax and features supported by your warehouse.
    • The queries will run with your data warehouse credentials and permissions.
    • All queries execute directly against your live warehouse environment.
Note: Pass-Through SQL includes autocomplete. To provide a smoother authoring experience, Domo loads a list of all the warehouse databases, schemas, and tables that the account has access to.
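For example, a query like the following could be entered in the Code tab when the SQL Dialect is set to Snowflake. It uses fully qualified Snowflake object names and Snowflake's QUALIFY clause; the database, schema, table, and column names are hypothetical and only illustrate the idea.

    -- Keep the most recent order per customer using Snowflake's QUALIFY clause.
    -- SALES_DB.PUBLIC.ORDERS is a hypothetical table in the connected warehouse.
    SELECT
        customer_id,
        order_id,
        order_total,
        order_date
    FROM SALES_DB.PUBLIC.ORDERS
    QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY order_date DESC) = 1;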

FAQ

Can Pass-Through SQL run statements other than SELECT, such as DDL or DML?
No, Pass-Through SQL is limited to query functions (SELECT).
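As a rough illustration (again using hypothetical object names), a read-only query like this is accepted in the SQL tile, while statements that create or modify warehouse objects are not:

    -- Allowed: a read-only SELECT against the connected warehouse.
    SELECT region, SUM(order_total) AS total_sales
    FROM SALES_DB.PUBLIC.ORDERS
    GROUP BY region;

    -- Not allowed: DDL/DML would be rejected, because Pass-Through SQL
    -- is limited to SELECT statements. For example:
    -- CREATE TABLE SALES_DB.PUBLIC.SALES_SUMMARY AS SELECT ...;
    -- INSERT INTO SALES_DB.PUBLIC.ORDERS VALUES (...);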
Who can save a DataFlow that uses Pass-Through SQL?
Only the integration account owner can save a DataFlow with Pass-Through SQL enabled. Alternatively, if integration sharing is enabled, users with the Edit DataFlow grant and Can Write permission to that integration can also save it.