Intro
Magic ETL on BigQuery allows you to transform your BigQuery DataSets using Domo data transforms and write the results back to BigQuery without moving data or creating complex configurations.
Prerequisites
To use this feature, you need a Domo Cloud Amplifier integration configured for both “read” and “write” capability with your BigQuery account. Learn more about integrating BigQuery using Cloud Amplifier.
Required Grant
To use Magic ETL on BigQuery, users must have the following grant enabled for their role:
- Edit Adrenaline DataFlow — Allows users to create, edit, and run Adrenaline DataFlows to which they have access.
Access Magic ETL on BigQuery
You can use this feature in the Magic ETL interface after completing your BigQuery-Cloud Amplifier transform integration. Access Magic ETL from the details page of any DataSet by selecting Open With > Magic ETL.
Use Magic ETL on BigQuery
Magic ETL on BigQuery allows you to use the Magic ETL interface to define a data transformation while the transformation itself executes on BigQuery. When you configure a BigQuery-Cloud Amplifier integration to allow transform operations, a dropdown displays in the Magic ETL editor where you can choose where the ETL executes. The dropdown provides options for each of the available BigQuery DataSets. Follow these steps to define and execute a data transformation:
1. Find your BigQuery integration using the connection name. You can find the connection by navigating to Data > Warehouse.
2. Select Transform Data > Magic ETL.
3. Use Compute to select your BigQuery integration. This action updates the Magic ETL background to indicate that you are now using Magic ETL on BigQuery.
4. Drag an Input DataSet tile onto the canvas and select a DataSet that is connected to your chosen BigQuery integration.
5. Finish creating your DataFlow. Every time your new DataFlow runs, the execution happens in BigQuery.
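Conceptually, a DataFlow built this way is pushed down to BigQuery and runs as SQL inside your warehouse, so the data never leaves BigQuery. Domo generates its own SQL internally; the sketch below is only an illustration of the kind of statement a simple input → filter → output flow corresponds to. The project, dataset, table names, and filter condition are all hypothetical.

```python
# Illustrative sketch only: Domo's actual generated SQL is internal to the
# product. This helper just shows what a pushed-down "input -> filter -> output"
# transform looks like as a single BigQuery Standard SQL statement that
# materializes the result back into the same warehouse.

def pushdown_sql(project: str, dataset: str, source: str, target: str, condition: str) -> str:
    """Build a CREATE OR REPLACE TABLE ... AS SELECT statement that writes a
    filtered copy of a source table to a target table inside BigQuery."""
    src = f"`{project}.{dataset}.{source}`"   # input DataSet's backing table
    dst = f"`{project}.{dataset}.{target}`"   # output DataSet's backing table
    return (
        f"CREATE OR REPLACE TABLE {dst} AS\n"
        f"SELECT *\n"
        f"FROM {src}\n"
        f"WHERE {condition}"
    )

# Hypothetical flow: keep only western-region orders.
sql = pushdown_sql("my-project", "analytics", "orders_raw", "orders_west", "region = 'WEST'")
print(sql)
```

Because the whole statement executes in BigQuery, only the transform definition travels between Domo and the warehouse, not the row data itself.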
FAQ
Can I set my Magic ETL DataFlow to use append when it executes on BigQuery?
Can I set my Magic ETL DataFlow to use upsert or partitioning when it executes on BigQuery?
Can my Magic ETL use existing partitions on my BigQuery table when running an ETL?
Do I have to configure any special permissions in BigQuery to be able to use Magic ETL?
Can I limit which users in Domo can execute Magic ETL on BigQuery?
Do I need to set up a default storage cloud?
Can I take a Magic ETL DataFlow that was previously configured to execute on Magic ETL and swap it over to execute on my BigQuery cloud (assuming all inputs are on the BigQuery cloud)?
Is the Python tile supported?
Is the R Scripting tile supported?
Am I charged Domo credits for a Magic ETL DataFlow that executes on BigQuery?