Intro
Magic ETL on Snowflake uses the Snowflake-Cloud Amplifier integration to give Snowflake users the power of Magic ETL transforms without moving data or requiring assistance from data engineers. While keeping your data secure inside your existing data structure, Magic ETL on Snowflake makes self-serve transformation possible for low-code users and reduces the burden on engineering resources.

Note: This feature is in beta. To enable it, contact your Domo account team. Magic ETL on Snowflake is only available to customers on the Domo Consumption agreement.
Prerequisites
To use this feature, complete the items below in Domo and Snowflake.
Domo Requirements
Complete the following in Domo:
- Configure write capability for Snowflake with Cloud Amplifier.
- For at least one Snowflake warehouse, choose Transform as the operation. This MUST be on the same Cloud Amplifier integration configured with write capability.
- All input DataSets must be part of the same Snowflake-Cloud Amplifier integration. A Magic ETL DataFlow with inputs from the Domo cloud cannot execute on the Snowflake cloud.
- Inputs from more than one Snowflake cloud are not supported.
Snowflake Requirements
Complete the following in Snowflake:
- The Snowflake role of the service account used in the Snowflake-Cloud Amplifier integration needs SELECT, MODIFY, USAGE, and CREATE privileges on the tables you are referencing as inputs. These privileges should already be granted by the commands run when you configure write capability.
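If you need to verify or grant these privileges manually, the statements look like the following sketch. The database, schema, and role names here (DOMO_DB, PUBLIC, DOMO_SVC_ROLE) are hypothetical placeholders; substitute the objects used by your own integration.

```sql
-- Illustrative only: grant the privileges listed above to the
-- service account's role. All object names are hypothetical.
GRANT USAGE ON DATABASE DOMO_DB TO ROLE DOMO_SVC_ROLE;
GRANT USAGE, MODIFY, CREATE TABLE ON SCHEMA DOMO_DB.PUBLIC TO ROLE DOMO_SVC_ROLE;
GRANT SELECT ON ALL TABLES IN SCHEMA DOMO_DB.PUBLIC TO ROLE DOMO_SVC_ROLE;
```

Run these as a role with sufficient privileges (for example, the schema owner or SECURITYADMIN), then confirm with SHOW GRANTS TO ROLE DOMO_SVC_ROLE.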
Required Grant
To use Magic ETL on Snowflake, users must have the following grant enabled for their role:
- Edit Adrenaline DataFlow — Allows users to create, edit, and run Adrenaline DataFlows to which they have access.
Access Magic ETL on Snowflake
You can use this feature in the Magic ETL interface after completing a Snowflake-Cloud Amplifier transform integration. Access Magic ETL from the details page of any DataSet by selecting Open With > Magic ETL.
Use Magic ETL on Snowflake
Magic ETL on Snowflake allows you to use the Magic ETL interface to define a data transformation, but the transformation executes on Snowflake. When you configure a Snowflake-Cloud Amplifier integration to allow transform operations, a dropdown displays in the Magic ETL editor, where you can choose where the ETL executes. The dropdown provides options for each of the available Snowflake clouds.
FAQ
Can I configure a Magic ETL DataFlow to use the Append method when it executes on Snowflake?
Yes. The default method is Replace, but you can change it to Append.
Is subset processing supported?
Yes, subset processing is supported by leveraging Snowflake Dynamic tables. Learn more about subset processing.
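For context on the underlying mechanism: a Snowflake dynamic table is a table whose contents are kept up to date from a defining query. Domo manages these objects as part of subset processing, but a standalone definition looks roughly like the following sketch (table, column, and warehouse names are hypothetical).

```sql
-- Illustrative only: a dynamic table that maintains a 30-day subset
-- of a source table. All names here are hypothetical.
CREATE OR REPLACE DYNAMIC TABLE sales_subset
  TARGET_LAG = '20 minutes'
  WAREHOUSE = TRANSFORM_WH
  AS
    SELECT order_id, amount, updated_at
    FROM raw_sales
    WHERE updated_at >= DATEADD(day, -30, CURRENT_TIMESTAMP());
```

TARGET_LAG controls how stale the table is allowed to become before Snowflake refreshes it from the defining query.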
Can Magic ETL use existing partitions on my Snowflake table when running a DataFlow?
Yes. Magic ETL can leverage optimizations like cluster keys and clustered tables at execution time without additional configuration. Learn about configuring these items in Snowflake documentation.
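For reference, a cluster key is defined on an existing Snowflake table with an ALTER TABLE statement like the one below; the table and column names are hypothetical.

```sql
-- Illustrative only: cluster an existing table by the columns most
-- often used in filters and joins. Names are hypothetical.
ALTER TABLE raw_sales CLUSTER BY (region, order_date);
```

Because clustering is a property of the table itself, no change to the Magic ETL DataFlow is needed for it to benefit.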
Can I limit which users in Domo execute Magic ETL DataFlows on Snowflake?
Yes. Only Domo users with the Edit Adrenaline DataFlow grant can execute DataFlows.
Can I take a Magic ETL DataFlow originally configured to run in Magic ETL on Domo and convert it to execute on my Snowflake cloud?
Yes — if all inputs for the DataFlow are in that Snowflake cloud and if the DataFlow uses the Replace or Append update method. If the DataFlow was configured with the Upsert or Partition method, it will error. Learn about DataSet update methods.
Are the Python and R scripting tiles supported?
No. The Python and R scripting tiles are not supported for Magic ETL on Snowflake.
Am I charged Domo credits for Magic ETL DataFlow executions on Snowflake?
Yes. Please refer to your sales agreement for credit consumption terms.
Next steps: Learn how to create a Magic ETL DataFlow or learn about the action tiles available in the Magic ETL drag-and-drop interface.