
# Azure Data Factory Pipelines

This directory contains the ARM template definitions to deploy an Azure Data Factory pipeline that copies data from Azure SQL Database to Azure Storage (Delta Lake).

## Prerequisites

## Setup and Deployment

1. Ensure you are in `e2e_samples/dataset-versioning/datafactory/`.
2. Run the `arm_deploy_script` output from Terraform after you have provisioned the Azure resources with IaC (Terraform); a tip for re-printing this output follows the example below. The command will look like this:

   ```sh
   az deployment group create --name {Your deployment name} --resource-group {Your resource group name} --template-file ./arm_template/arm_template.json --parameters factoryName="{Your data factory name}" KeyVault_properties_typeProperties_baseUrl="{Your key vault url}" AzureBlobFS_properties_typeProperties_serviceEndpoint="{Your blob storage url}"
   ```

Example of a populated command:

```sh
az deployment group create --name arm_deploy --resource-group rg-masatf2 --template-file ./arm_template/arm_template.json --parameters factoryName='adf-masatfapp-dev' KeyVault_properties_typeProperties_baseUrl='https://kv-masatfapp-dev-eastus.vault.azure.net/' AzureBlobFS_properties_typeProperties_serviceEndpoint='https://dlsmasatfappdev.blob.core.windows.net/'
```
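If you need to print the generated command again after provisioning, Terraform can output it directly. This is a minimal sketch, assuming the sample's Terraform configuration exposes the `arm_deploy_script` output mentioned in step 2 and that you run it from the same Terraform working directory:

```sh
# Print the ready-to-run deployment command generated during provisioning.
# -raw emits the output value without quotes so it can be copied or piped directly.
terraform output -raw arm_deploy_script
```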

### Parameters in deployment script

| Name | Description |
| --- | --- |
| factoryName | Azure Data Factory name that the ARM template will be deployed into |
| KeyVault_properties_typeProperties_baseUrl | Your Key Vault URL |
| AzureBlobFS_properties_typeProperties_serviceEndpoint | Azure Blob Storage endpoint (URL) |
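To confirm that the deployment completed successfully, you can query its provisioning state. This is a minimal check, assuming the deployment name and resource group from the populated example above:

```sh
# Returns "Succeeded" once the ARM deployment has completed.
az deployment group show \
  --name arm_deploy \
  --resource-group rg-masatf2 \
  --query properties.provisioningState \
  --output tsv
```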

See the Azure documentation for more information about deploying ARM templates.

## Next step

Running the sample: Load data into data source (Azure SQL Database)