Deep Dive into Azure Data Factory v2
Eric Bragas
• Senior Business Intelligence Consultant with DesignMind
• Always had a passion for art, design, and clean engineering (a.k.a. I own a Dyson vacuum)
• Undergoing my Accelerated Freefall training to become a certified skydiver
• And I often overuse parentheses (and commas).
https://www.linkedin.com/in/ericbragas93/
@ericbragas
eric@designmind.com
Agenda
• Overview
• Azure Data Factory v2
• ADF and SSIS
• Components
• Expressions, Functions, Parameters, and System Variables
• Development
• Monitoring and Management
• Q&A
Overview
What is Azure Data Factory v2?
Overview
• "[Azure Data Factory] is a cloud-
based data integration service
that allows you to create data-
driven workflows in the cloud
that orchestrate and automate
data movement and data
transformation.“
• Version 1 – service for batch
processing of time series data
• Version 2 – a general purpose
data processing and workflow
orchestration tool
Comparison
(Diagram comparing the capabilities of ADFv2, ADFv1, and SSIS)
ETL vs. ELT
• Key difference is where the transformations are processed
• ETL – transforms are processed by the integration tool (e.g. SSIS)
• ELT – transforms are processed by the target database (e.g. Data Lake, SQL, etc.)
• Main benefit is scalability to larger data volumes
• Main drawback is the added step between source and destination
• This isn't always a drawback when you are feeding multiple destinations from the same pool of raw data
My preference is ELT, even in non-big-data scenarios, because a database engine can typically perform asynchronous transformations faster than SSIS
Version 1 vs. Version 2
Version 1:
• Time-series based
• Schedules driven by dataset availability
• Developed using Visual Studio
• Pretty cool
Version 2:
• General purpose
• Explicit and tumbling-window scheduling
• Freaking awesome
Version 1 vs. Version 2
Component Changes
• Datasets – no longer use the "availability" property
• Linked Services – include the new "connectVia" property, which allows selection of the Integration Runtime to use (see the Integration Runtime section)
• Pipelines – now the unit of scheduling, instead of activities
• Activities – control and non-control activity types; dependencies between activities
• Triggers – new scheduling component
• Integration Runtime – replacement for the Data Management Gateway
Version 2 vs. SSIS
• Pipelines ~= Packages
• Can use similar master-child patterns
• Linked Services ~= Connection Managers
• SSIS usually extracts, transforms, and loads data all as a single process, whereas ADF leverages external compute services to do transformation.
You can also deploy and trigger SSIS packages in ADFv2 using the Azure-SSIS Integration Runtime
Sample of Supported Sources/Sinks
Components
Building Blocks
Linked Services
• A saved connection string to a data storage or compute service
• Doesn't specify anything about the data itself, just the means of connecting to it
• Referenced by Datasets
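To make this concrete, here is a minimal sketch of a linked service definition and how it could be deployed with the Az PowerShell module (the older AzureRM cmdlets have the same shape). The factory name, resource group, and connection string are placeholders, not values from this deck:

# Hypothetical Azure SQL Database linked service (JSON definition)
$definition = @'
{
  "name": "AzureSqlLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": {
        "type": "SecureString",
        "value": "Server=tcp:myserver.database.windows.net;Database=mydb;User ID=myuser;Password=<secret>"
      }
    }
  }
}
'@
Set-Content -Path .\AzureSqlLinkedService.json -Value $definition
# Push the definition into an existing data factory
Set-AzDataFactoryV2LinkedService -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" `
  -Name "AzureSqlLinkedService" -DefinitionFile ".\AzureSqlLinkedService.json"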
Dataset
• A data structure within a storage linked service
• Think: SQL table, blob file or folder, HTTP endpoint, etc.
• Can be read from and written to by Activities
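Continuing the hypothetical example above, a dataset that exposes a SQL table through that linked service might look like this (the dataset and table names are placeholders):

# Hypothetical dataset pointing at a table via the linked service above
$definition = @'
{
  "name": "SalesTableDataset",
  "properties": {
    "type": "AzureSqlTable",
    "linkedServiceName": {
      "referenceName": "AzureSqlLinkedService",
      "type": "LinkedServiceReference"
    },
    "typeProperties": {
      "tableName": "dbo.Sales"
    }
  }
}
'@
Set-Content -Path .\SalesTableDataset.json -Value $definition
Set-AzDataFactoryV2Dataset -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" `
  -Name "SalesTableDataset" -DefinitionFile ".\SalesTableDataset.json"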
Activity
• A component within a Pipeline that performs a single operation
• Control and Non-control activities
• Copy
• Lookup
• Web Request
• Execute U-SQL Job
• Can be linked together via dependencies
• On Success
• On Failure
• On Completion
• On Skip
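A sketch of how those dependencies appear in a pipeline's activities array; the dependencyConditions values (Succeeded, Failed, Completed, Skipped) correspond to the four dependency types above. Wait activities are used here only to keep the fragment short:

# Fragment of a pipeline's "activities" array showing an On Success dependency
$activitiesFragment = @'
[
  { "name": "FirstActivity", "type": "Wait", "typeProperties": { "waitTimeInSeconds": 1 } },
  {
    "name": "SecondActivity",
    "type": "Wait",
    "typeProperties": { "waitTimeInSeconds": 1 },
    "dependsOn": [
      { "activity": "FirstActivity", "dependencyConditions": [ "Succeeded" ] }
    ]
  }
]
'@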
Pipeline
• Pipelines are the containers for a series of activities that make up a workflow
• Started via a trigger, accept parameters, and maintain system variables such as @pipeline().RunId
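Putting the pieces together, a minimal sketch of a full pipeline definition that declares a parameter and consumes it through an expression (all names are hypothetical):

$definition = @'
{
  "name": "DemoPipeline",
  "properties": {
    "parameters": {
      "waitSeconds": { "type": "Int", "defaultValue": 1 }
    },
    "activities": [
      {
        "name": "WaitActivity",
        "type": "Wait",
        "typeProperties": {
          "waitTimeInSeconds": {
            "value": "@pipeline().parameters.waitSeconds",
            "type": "Expression"
          }
        }
      }
    ]
  }
}
'@
Set-Content -Path .\DemoPipeline.json -Value $definition
Set-AzDataFactoryV2Pipeline -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" `
  -Name "DemoPipeline" -DefinitionFile ".\DemoPipeline.json"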
Triggers
• Schedules that trigger pipeline execution
• More than one pipeline can subscribe to a single trigger
• Explicit schedule – e.g. every Monday at 3 AM, or…
• Tumbling window – e.g. every 6 hours starting today at 6 AM
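A sketch of a schedule trigger for the "every Monday at 3 AM" case above, wired to the hypothetical pipeline from earlier. Note that triggers are created in a stopped state and must be started explicitly:

$definition = @'
{
  "name": "MondayMorningTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Week",
        "interval": 1,
        "startTime": "2018-01-01T00:00:00Z",
        "timeZone": "UTC",
        "schedule": { "weekDays": [ "Monday" ], "hours": [ 3 ], "minutes": [ 0 ] }
      }
    },
    "pipelines": [
      {
        "pipelineReference": { "type": "PipelineReference", "referenceName": "DemoPipeline" },
        "parameters": { "waitSeconds": 5 }
      }
    ]
  }
}
'@
Set-Content -Path .\MondayMorningTrigger.json -Value $definition
Set-AzDataFactoryV2Trigger -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" `
  -Name "MondayMorningTrigger" -DefinitionFile ".\MondayMorningTrigger.json"
Start-AzDataFactoryV2Trigger -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" -Name "MondayMorningTrigger"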
Integration Runtime
An activity defines the action to be performed.
A linked service defines the target data store or compute service.
An integration runtime provides the bridge between the two.
• Data Movement: moves data between public and private data stores and across on-premises networks; supports built-in connectors, format conversion, column mapping, etc.
• Activity Dispatch: dispatches and monitors transformation activities running on services such as SQL Server, HDInsight, AML, etc.
• SSIS Package Execution: natively executes SSIS packages.
Integration Runtime (cont’d)
Types of IR:
• Azure (default)
• Self-Hosted
• Azure-SSIS
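As an example, a self-hosted IR can be registered from PowerShell and its authentication key retrieved for the on-premises node installation; linked services then point at it through their "connectVia" property (names are placeholders):

# Register a self-hosted IR in the factory
Set-AzDataFactoryV2IntegrationRuntime -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" `
  -Name "MySelfHostedIR" -Type SelfHosted -Description "IR for on-premises sources"
# Retrieve the key used when installing the on-premises node
Get-AzDataFactoryV2IntegrationRuntimeKey -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" -Name "MySelfHostedIR"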
Demo
Create Copy Pipeline
Expressions, Functions,
Parameters, and System Variables
Oh my!
Expressions
• Syntax evaluated during the execution of an activity, allowing dynamic changes to the property configurations it is used within
• Expressions reference things such as parameters and the output of previous activities, and provide access to the current item being iterated over by loops
@pipeline().parameters.myParam
@formatDateTime(item().value.myDateAttr, 'yyyy-MM-dd')
Custom State Passing
• Custom State Passing refers to the ability of a downstream activity to access the output of an upstream activity
• Expressions can be used to access these output states and change the configuration of the currently executing activity
@activity('myUpstreamActivity').output.rowsRead
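A common use is reading a value returned by a Lookup activity in the activity that follows it. Below is a sketch of a Copy activity fragment that builds its source query from an upstream Lookup named LookupWatermark; all activity, dataset, and column names here are hypothetical:

$activityFragment = @'
{
  "name": "CopyNewRows",
  "type": "Copy",
  "dependsOn": [
    { "activity": "LookupWatermark", "dependencyConditions": [ "Succeeded" ] }
  ],
  "inputs": [ { "referenceName": "SalesTableDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "SalesBlobDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": {
      "type": "SqlSource",
      "sqlReaderQuery": {
        "value": "SELECT * FROM dbo.Sales WHERE ModifiedDate > '@{activity('LookupWatermark').output.firstRow.WatermarkValue}'",
        "type": "Expression"
      }
    },
    "sink": { "type": "BlobSink" }
  }
}
'@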
Functions
• String – string manipulation
• Collection – operate over arrays, strings, and sometimes dictionaries
• Logic – conditions
• Conversion – convert between native types
• Math – operate on integers and floats
• Date
There is not currently a way to add or define additional functions
Parameters
• Key-value pairs that can be passed to a pipeline when it is started, or to a dataset when it is used by an activity
• The receiving component must first be configured with a parameter of a specific name and data type before the calling component can be configured to pass a value (a dataset example is sketched below)
Two types of Parameters:
• Pipeline
• Dataset
@pipeline().parameters.myParam
@dataset().myParam
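Dataset parameters must first be declared on the dataset, then supplied by the activity that references it. A sketch with a hypothetical tableName parameter:

$definition = @'
{
  "name": "ParameterizedTableDataset",
  "properties": {
    "type": "AzureSqlTable",
    "linkedServiceName": { "referenceName": "AzureSqlLinkedService", "type": "LinkedServiceReference" },
    "parameters": {
      "tableName": { "type": "String" }
    },
    "typeProperties": {
      "tableName": { "value": "@dataset().tableName", "type": "Expression" }
    }
  }
}
'@
# An activity then passes a value through its dataset reference, e.g.:
# "inputs": [ { "referenceName": "ParameterizedTableDataset", "type": "DatasetReference",
#               "parameters": { "tableName": "dbo.Sales" } } ]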
System Variables
• System Variables are read-only values that are managed by the Data Factory and provide metadata about the current execution
• These can be used for custom logging or within expressions
• They can be either Pipeline-scoped or Trigger-scoped
@pipeline().DataFactory
@trigger().scheduledTime
ADFv2 Development
Tools and Techniques
Design Patterns
• Delta/Incremental Loading
• Dynamic table loading
• Custom logging
• Using database change tracking
Monitoring and Management
Tools and Techniques
Tools for Monitoring
• Azure Portal – Author and Monitor
• PowerShell
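A sketch of run monitoring from PowerShell, assuming the hypothetical factory and pipeline names used earlier:

# Start a run and capture the run ID
$runId = Invoke-AzDataFactoryV2Pipeline -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" `
  -PipelineName "DemoPipeline"
# Check the status of the pipeline run
Get-AzDataFactoryV2PipelineRun -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" -PipelineRunId $runId
# Drill into the individual activity runs, including their error messages
Get-AzDataFactoryV2ActivityRun -ResourceGroupName "MyRg" -DataFactoryName "MyFactory" `
  -PipelineRunId $runId -RunStartedAfter (Get-Date).AddDays(-1) -RunStartedBefore (Get-Date).AddDays(1)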
Deployment
• Use separate dev/test/prod resource groups and Data Factory services
• Deploy to the separate services using ARM templates (until the Visual Studio extension is available)
• Can also script deployments using PowerShell or Python SDK
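For example, one ARM template can be promoted through environments by swapping parameter files; the file names and resource group here are placeholders:

# Deploy the factory template into the target environment
New-AzResourceGroupDeployment -ResourceGroupName "MyRg-Prod" `
  -TemplateFile ".\arm_template.json" `
  -TemplateParameterFile ".\arm_template_parameters_prod.json"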
Debugging
• Use the monitor, drill into a pipeline, and view error messages directly on the activity
• You cannot see the result of an evaluated expression, so you may need to be clever
• Depending on the error, you may get a message that is completely useless. Good luck.
Deploying SSIS to the Azure-SSIS Integration Runtime
• Allows deployment and execution of native SSIS packages
• Use Azure SQL Database to host SSISDB Catalog
• Limitations exist with using the Azure SDK for SSIS
• Cannot execute U-SQL jobs
• Lift-and-shift option for existing SSIS packages
Contact Info
https://www.linkedin.com/in/ericbragas93/
@ericbragas
eric@designmind.com
https://github.com/ebragas
fin.


Editor's Notes

  1. ADFv2 is more similar to SSIS than ADFv1 was. It's a general-purpose ELT tool with similar scheduling mechanisms, and it still supports time-series sources using tumbling-window schedules.
  2. http://www.jamesserra.com/archive/2012/01/difference-between-etl-and-elt/ Other reasons I prefer ELT: set-based code; performance-tuning T-SQL instead of packages; allocating more resources to the database engine rather than to SSIS; the code looks cooler.
  3. https://docs.microsoft.com/en-us/azure/data-factory/compare-versions
  4. https://docs.microsoft.com/en-us/azure/data-factory/compare-versions
  5. All supported services in v1: https://docs.microsoft.com/en-us/azure/data-factory/v1/data-factory-create-datasets Supported Services in v2: https://docs.microsoft.com/en-us/azure/data-factory/concepts-datasets-linked-services
  6. Previously in ADFv1, the Activity was the unit of scheduling as well as a unit of execution. This meant that activities had the ability to execute on different schedules within a given pipeline. This is no longer the case as the Pipeline is now the unit of scheduling and is kicked off by a trigger (or on-demand).
  7. Because of the change from time-series-based scheduling to explicit scheduling, pipelines can no longer be set up to back-fill data using time slices (e.g. every day at noon, starting 3 months ago). Schedules can now only start today and go forward, which means any back-filling logic must be explicitly defined by the developer.
  8. Types of IR: Azure, Self-Hosted, Azure-SSIS
  9. https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
  10. When debugging a pipeline, you can't really see the result of an expression, so you have to be attentive to figure out how it's resolving. https://docs.microsoft.com/en-us/azure/data-factory/control-flow-expression-language-functions
  11. https://docs.microsoft.com/en-us/azure/data-factory/control-flow-system-variables
  12. https://docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-overview https://docs.microsoft.com/en-us/azure/data-factory/tutorial-incremental-copy-multiple-tables-portal