A nice feature of Azure Data Factory is the ability to copy multiple tables with a minimum of coding. To do this, we can use a Lookup activity, a ForEach loop, and a Copy activity.
To make this sample work, you need to create all the tables you want to copy in the sink database and make sure that you can insert values into all of their columns. Primary key columns with auto-increment (identity) need to be changed, any foreign keys in your tables need to be dropped, and all computed columns must be changed to regular columns.
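A quick way to find what has to be changed is to query the catalog views in the sink database. The statements below are only a sketch; review the generated DROP statements before you run anything.

-- Identity and computed columns that will block a straight insert
SELECT OBJECT_SCHEMA_NAME(object_id) AS SchemaName, OBJECT_NAME(object_id) AS TableName, name AS ColumnName, is_identity, is_computed
FROM sys.columns
WHERE is_identity = 1 OR is_computed = 1

-- Generate DROP statements for all foreign keys
SELECT 'ALTER TABLE [' + OBJECT_SCHEMA_NAME(parent_object_id) + '].[' + OBJECT_NAME(parent_object_id) + '] DROP CONSTRAINT [' + name + ']'
FROM sys.foreign_keys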
In my Lookup activity, I will use this query to list all the table and schema names in the AdventureWorksLT database.
SELECT '[' + TABLE_SCHEMA + '].[' + TABLE_NAME + ']' AS MyTableWithSchema, TABLE_SCHEMA AS MySchema, TABLE_NAME AS MyTable FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'
This gives me a list of all the table and schema names inside the database.
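Against AdventureWorksLT, the result contains one row per base table, for example:

MyTableWithSchema            | MySchema | MyTable
[SalesLT].[Customer]         | SalesLT  | Customer
[SalesLT].[Product]          | SalesLT  | Product
[SalesLT].[SalesOrderHeader] | SalesLT  | SalesOrderHeader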
Next, I will add a ForEach activity to loop through all my table names.
In the ForEach settings, set Items to the following expression (Get-Tables is the name of my Lookup activity): @activity('Get-Tables').output.value
Then double-click on your ForEach activity and add a Copy activity inside it.
In the source of the Copy activity, add the following query: SELECT * FROM @{item().MyTableWithSchema}
If you want to truncate the tables before you load them, you can add this script to the “Pre-copy script” field on the sink: TRUNCATE TABLE @{item().MyTableWithSchema}
Open your sink dataset and add two parameters, one for the schema name and one for the table name.
Then configure the dataset to use these new parameters instead of a hard-coded table name.
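Assuming the two parameters are called MySchema and MyTable (the names are just examples), the schema and table fields on the dataset's connection tab can reference them with dataset expressions:

Schema: @dataset().MySchema
Table: @dataset().MyTable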
Go back to your Copy activity and open the sink tab.
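The dataset parameters are now visible there and can be filled from the current ForEach item, reusing the column aliases returned by the lookup query:

MySchema: @item().MySchema
MyTable: @item().MyTable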
With the values from the ForEach item assigned to the parameters, save and publish, then execute the pipeline.
If it executes successfully, the monitoring output will show one successful copy run for each table in the list.