Navigating the Azure Data Factory v2 Enhancements

Published 12/12/2018 | Updated 01/08/2019
Azure Data Factory (ADF) became generally available as an Azure platform service in 2015. The service was built to be the central resource for data orchestration activities in the cloud.
 
Whether the requirement is to simply copy data from source to destination or to kick off a Data Lake Analytics transformation job, ADF is the answer. ADF is a fully managed cloud service built for complex hybrid Extract, Transform, Load (ETL); Extract, Load, Transform (ELT); and data integration processing.
 
At the end of 2017, Microsoft publicly released ADF version 2 (v2), which introduced various enhancements and incorporated customer feedback from the initial release. While the service is still in public preview, anyone can test the functionality and the changes from ADF v1. Let’s walk through some of the key enhancements, including SQL Server Integration Services (SSIS) capabilities (finally!).
 

Capabilities introduced in ADF v2

 
The overall concept of data sets, activities and pipelines remains intact within v2. However, the new version brings a few changes:
 
  • Control flow concepts
    ADF v2 adds common ETL control flow concepts such as chaining activities, branching activities, pipeline parameterization, custom state passing, looping containers, delta workflows, a Get Metadata activity, a Web activity and many more. Let’s dig deeper into a few of these concepts (a JSON pipeline sketch follows at the end of this list):
    • Chaining activities: A new activity property, dependsOn, lets the user chain activities by naming the upstream activity that must finish, along with the dependency condition (such as Succeeded), before the current activity runs.
    • Branching activities: Similar to an if statement in programming languages, the new If Condition activity evaluates a Boolean expression and routes processing to different downstream activities depending on whether the result is ‘True’ or ‘False.’
    • Looping containers: The ForEach activity is a repeating control flow in the pipeline that iterates over a specified collection and executes the contained activities once for each item. Similarly, the Until activity loops until the provided condition evaluates to ‘True.’
    • Delta workflows: Common in ETL processing, delta loads read in only the data that has changed since the last execution. ADF v2 includes a Lookup activity, which can retrieve the high-water mark from the previous run and makes delta loads natural to implement.
  • SSIS functionality
    With this much-anticipated addition, users can “lift and shift” SSIS solutions from on-premises servers to the cloud with ADF v2. The service allows the user to spin up an Azure-SSIS Integration Runtime: essentially a fully managed cluster of Virtual Machines (VMs) dedicated to running the packages.
    SSISDB can be hosted on a Platform as a Service (PaaS) instance of Azure SQL Database to orchestrate project deployments. Within the ADF visual authoring tool, a pipeline can be created that contains a Stored Procedure activity, which executes packages in SSISDB (see the sketch after this list). The pipeline can then be tied to a trigger, which will schedule all future executions.
  • Flexible scheduling
    Triggers contain properties that determine when pipelines are kicked off and executed. There are two types of triggers in ADF v2, both sketched at the end of this list:
    • Schedule trigger: triggers based on wall-clock schedules
    • Tumbling window trigger: triggers that operate on a periodic interval while retaining state
  • Visual authoring & monitoring
    In early 2018, based on customer feedback, Microsoft added rich interactive visuals for authoring and monitoring ADF pipelines. This allows users to build and publish pipelines without writing a single line of code. Integration with Visual Studio Team Services Git adds source control, so every change to a pipeline definition is tracked.
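
To make the control flow concepts concrete, here is a minimal sketch of an ADF v2 pipeline definition, which is authored as JSON: a Web activity is chained into an If Condition activity through dependsOn. The URL, activity names and the status field on the response are hypothetical, not part of any ADF schema; ForEach and Until activities nest their inner activities the same way ifTrueActivities does here.

    {
        "name": "ControlFlowSamplePipeline",
        "properties": {
            "activities": [
                {
                    "name": "GetSourceStatus",
                    "type": "Web",
                    "typeProperties": {
                        "url": "https://example.com/api/load-status",
                        "method": "GET"
                    }
                },
                {
                    "name": "BranchOnStatus",
                    "type": "IfCondition",
                    "dependsOn": [
                        {
                            "activity": "GetSourceStatus",
                            "dependencyConditions": [ "Succeeded" ]
                        }
                    ],
                    "typeProperties": {
                        "expression": {
                            "value": "@equals(activity('GetSourceStatus').output.status, 'ready')",
                            "type": "Expression"
                        },
                        "ifTrueActivities": [
                            {
                                "name": "WaitForDownstream",
                                "type": "Wait",
                                "typeProperties": { "waitTimeInSeconds": 30 }
                            }
                        ]
                    }
                }
            ]
        }
    }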
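
For the SSIS scenario, a sketch of the Stored Procedure activity might look like the following. The linked service name and the wrapper procedure dbo.sp_RunDailyLoadPackage are hypothetical; such a wrapper would internally call SSISDB’s catalog.create_execution and catalog.start_execution procedures to run a deployed package.

    {
        "name": "RunSsisPackagePipeline",
        "properties": {
            "activities": [
                {
                    "name": "ExecuteSsisPackage",
                    "type": "SqlServerStoredProcedure",
                    "linkedServiceName": {
                        "referenceName": "SsisDbLinkedService",
                        "type": "LinkedServiceReference"
                    },
                    "typeProperties": {
                        "storedProcedureName": "dbo.sp_RunDailyLoadPackage"
                    }
                }
            ]
        }
    }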
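
Triggers are defined as JSON as well. Below is a sketch of one trigger of each type wired to the hypothetical pipeline from the control flow example; the names, dates and intervals are illustrative, and the tumbling window example assumes the target pipeline declares a matching windowStart string parameter.

    {
        "name": "DailyScheduleTrigger",
        "properties": {
            "type": "ScheduleTrigger",
            "typeProperties": {
                "recurrence": {
                    "frequency": "Day",
                    "interval": 1,
                    "startTime": "2018-05-01T06:00:00Z",
                    "timeZone": "UTC"
                }
            },
            "pipelines": [
                {
                    "pipelineReference": {
                        "referenceName": "ControlFlowSamplePipeline",
                        "type": "PipelineReference"
                    }
                }
            ]
        }
    }

The tumbling window trigger retains state per window and can pass the window start time into the pipeline, so each run knows which slice of data it owns:

    {
        "name": "HourlyTumblingWindowTrigger",
        "properties": {
            "type": "TumblingWindowTrigger",
            "typeProperties": {
                "frequency": "Hour",
                "interval": 1,
                "startTime": "2018-05-01T00:00:00Z",
                "maxConcurrency": 2
            },
            "pipeline": {
                "pipelineReference": {
                    "referenceName": "ControlFlowSamplePipeline",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "windowStart": "@trigger().outputs.windowStartTime"
                }
            }
        }
    }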
 

Version differences

 
The following table introduces the high-level differences between the two ADF services.

    Capability                 ADF v1                               ADF v2
    Scheduling                 Data set time slices/availability    Triggers (schedule, tumbling window)
    Control flow               Not available                        Chaining, branching, looping, parameters
    SSIS package execution     Not supported                        Azure-SSIS Integration Runtime
    Compute infrastructure     Data Management Gateway              Integration Runtimes (Azure, self-hosted, Azure-SSIS)
 
One of the larger changes is the move from time slices and data set availability to a more traditional ETL scheduling process. Instead of an activity waiting for a data set to become available while a pipeline executes, the pipeline itself is triggered and kicks off its activities regardless of the state of the data set.
 
The Integration Runtimes (IR) are the compute infrastructure used by ADF v2 for data movement, activity execution and SSIS package execution. The IR provides the bridge between an activity and the linked services it references: each linked service points to an IR, which supplies the compute environment where the activity runs, ideally in the region nearest the target data store for the most efficient performance.
 
  • Azure IR instances, as mentioned in the table above, can perform activities between cloud data stores only. The Azure IR is fully managed, serverless compute in Azure.
  • Self-hosted IR instances can perform activities between cloud data stores and data stores on a private network. A self-hosted IR can be installed on-premises or in a virtual private network (a linked service sketch referencing one follows this list).
  • Azure-SSIS IR instances are built specifically for executing SSIS packages. To read on-premises data, the Azure-SSIS IR must join a Virtual Network (VNet) that connects to the on-premises network.
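
As a sketch of how a linked service opts into a specific IR, the JSON below points an on-premises SQL Server linked service at a self-hosted IR through the connectVia property. The server, database and IR names are hypothetical.

    {
        "name": "OnPremSqlServerLinkedService",
        "properties": {
            "type": "SqlServer",
            "typeProperties": {
                "connectionString": {
                    "type": "SecureString",
                    "value": "Server=myserver;Database=StagingDb;Integrated Security=True"
                }
            },
            "connectVia": {
                "referenceName": "MySelfHostedIR",
                "type": "IntegrationRuntimeReference"
            }
        }
    }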
 

The future of Azure Data Factory

 
The introduction of native SSIS capabilities in ADF v2 was a key addition to the cloud data orchestration service. It provides a stepping stone for customers to get off their on-premises servers and move to a cloud-first strategy, rather than completely re-architecting their existing data integration processes from SSIS to ADF v1.
 
Along with the SSIS integration, many other features, such as control flow tasks and triggers, allow for greater flexibility in pipeline executions. As ADF v2 moves out of public preview and toward general availability, users can submit feedback to Microsoft for further enhancements to the service.
 
 
This article originally appeared on May 10, 2018.