
Azure Data Factory

December 15, 2015

by ACS Solutions

The enterprise data landscape is growing exponentially in volume, variety, and complexity. Unstructured, semi-structured, and cloud-borne data must be processed through a mix of tools, frameworks, and custom services.

Azure Data Factory (ADF) is a cloud-based data integration service that orchestrates and automates the movement and transformation of data. ADF is part of Microsoft's analytics suite rather than a standalone ETL (Extract, Transform, Load) tool: it composes, orchestrates, and monitors processing over diverse data from many origins, which makes it a versatile data collector.

Azure Data Factory works across on-premises and cloud data sources as well as SaaS applications to assimilate, prepare, transform, analyze, and finally publish data. Like a manufacturing factory that runs equipment to ingest raw materials and convert them into finished goods, ADF collects raw data from diverse sources and transforms it into ready-to-use information.

In addition to these capabilities, Azure Data Factory provides a monitoring dashboard where users can track their data pipelines.

Azure Data Factory Concepts

Dataset: Datasets are named references/pointers to the data you want to use as an input or an output of an Activity, e.g. tables or files.

Activity: Activities define the actions to perform on your data, e.g. a Hive query or a copy operation.

Pipeline: Pipelines are logical groupings of Activities that can be managed, monitored, and scheduled as a unit.

Linked service: Linked services define the information Data Factory needs to connect to external resources (connection strings), e.g. a SQL Server instance or a Hadoop cluster.
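For illustration, here is a minimal sketch of what ADF (v1) JSON definitions look like for a linked service and a dataset. The server, database, and table names are placeholder assumptions, not details from this post.

```powershell
# Minimal sketch of ADF (v1) JSON definitions, written out as files for later
# deployment. All names and connection details are placeholder assumptions.

# Linked service: the connection information for an Azure SQL database.
@"
{
  "name": "AzureSqlLinkedService",
  "properties": {
    "type": "AzureSqlDatabase",
    "typeProperties": {
      "connectionString": "Server=tcp:<server>.database.windows.net,1433;Database=<db>;User ID=<user>;Password=<password>;Encrypt=True"
    }
  }
}
"@ | Out-File .\AzureSqlLinkedService.json

# Dataset: a named pointer to a table exposed through that linked service.
@"
{
  "name": "OutputSqlTable",
  "properties": {
    "type": "AzureSqlTable",
    "linkedServiceName": "AzureSqlLinkedService",
    "typeProperties": { "tableName": "dbo.Customers" },
    "availability": { "frequency": "Day", "interval": 1 }
  }
}
"@ | Out-File .\OutputSqlTable.json
```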

Developers can create a Data Factory with the Azure Portal, PowerShell (using Azure Resource Manager templates), Visual Studio (with Azure .NET SDK 2.7 or later), or the REST APIs of the Azure Data Factory SDK. Users author activities, combine them into a pipeline, and set an execution schedule.
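A minimal PowerShell sketch of that flow, reusing the JSON files from the previous sketch; the account, resource group, factory name, and region are placeholder assumptions:

```powershell
# Minimal sketch: create a factory and deploy the definitions (AzureRM.DataFactories).
Login-AzureRmAccount
New-AzureRmResourceGroup -Name "ADFDemoRG" -Location "West US"
New-AzureRmDataFactory -ResourceGroupName "ADFDemoRG" -Name "MyDataFactory" -Location "West US"

# Deploy the linked service and dataset JSON files from the sketch above.
New-AzureRmDataFactoryLinkedService -ResourceGroupName "ADFDemoRG" `
    -DataFactoryName "MyDataFactory" -File ".\AzureSqlLinkedService.json"
New-AzureRmDataFactoryDataset -ResourceGroupName "ADFDemoRG" `
    -DataFactoryName "MyDataFactory" -File ".\OutputSqlTable.json"
```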

ADF Can Come in Handy in the Following Scenarios

  • Accessing data sources such as on-premises SQL Server, SQL Azure, Azure Blob Storage, etc.
  • Transforming data through Hive, Pig, C#, etc. (a Hive activity sketch follows this list).
  • Monitoring data pipelines, validating them, executing scheduled jobs, and more.
  • Loading data into desired destinations such as on-premises SQL Server, SQL Azure, Azure Blob Storage, etc.
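As an illustration of the transformation scenario, here is a hedged sketch of what an ADF (v1) Hive activity fragment might look like; the script path, linked service names, and Hive variable are assumptions for illustration, and the fragment would sit inside a pipeline definition's activities array.

```powershell
# Hedged sketch of an ADF (v1) HDInsight Hive activity fragment; it belongs
# inside a pipeline definition's "activities" array. All names are placeholders.
@"
{
  "name": "TransformWithHive",
  "type": "HDInsightHive",
  "linkedServiceName": "HDInsightLinkedService",
  "typeProperties": {
    "scriptPath": "adfscripts/transform.hql",
    "scriptLinkedService": "StorageLinkedService",
    "defines": { "inputTable": "RawLogs" }
  }
}
"@ | Out-File .\HiveActivity.json
```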

Prerequisites to Use ADF

  • Microsoft Data Management Gateway, for access to on-premises resources (a PowerShell sketch follows this list).
  • An Azure subscription: signing up requires credit/debit card information, and a 30-day free trial is available.
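The gateway entry can also be registered from PowerShell; a minimal sketch with placeholder names (the gateway software must still be installed on the on-premises machine):

```powershell
# Minimal sketch: register a Data Management Gateway entry for the factory
# (the gateway software must also be installed on the on-premises machine).
# Names are placeholder assumptions.
New-AzureRmDataFactoryGateway -ResourceGroupName "ADFDemoRG" `
    -DataFactoryName "MyDataFactory" -Name "MyGateway"
```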

It’s a common notion that ADF is here to replace SSIS, but ADF is a complementary service and does not compete with SSIS.

|                               | SSIS                                                       | ADF                                                                                                |
| ----------------------------- | ---------------------------------------------------------- | -------------------------------------------------------------------------------------------------- |
| Development tool              | SSDT/BIDS                                                  | Portal, ADF Editor, PowerShell, JSON scripts                                                        |
| Data sources and destinations | Many sources and destinations                              | Azure Storage, Azure SQL Database, SQL Server, file system                                          |
| Data transformations          | Many transformations                                       | Fewer activities; custom activities must be written                                                 |
| Environment                   | Needs administrative effort and suitable hardware/software | Azure takes care of everything                                                                      |
| Pricing                       | Pay for features                                           | Pay per usage                                                                                       |
| Error handling                | Event handlers, failure precedence constraints             | Alert rules and error-message logging; no event handlers                                            |
| Deployment                    | Deployment wizard                                          | PowerShell scripts                                                                                  |
| Monitoring                    | SSIS logging and catalog reports                           | Powerful GUI, data-slice execution, drill-through monitoring                                        |
| Data lineage                  | None                                                       | Available                                                                                           |
| Security                      | Role-based for deploy, execute, and monitor                | Roles such as owner, contributor, reader, data factory contributor, and user access administrator   |


Moving On-premises Files to Azure SQL Database

Steps to copy .csv file data to a cloud database:

  1. Place the source file (.csv) on the local machine.
  2. Create a SQL database in the Azure portal.
  3. Create a destination table in that database using SSMS on your local machine.
  4. Create a Data Factory in the Azure portal for the required transfer (a PowerShell sketch of steps 4 and 5 follows this list):
    • Create a gateway to access on-premises resources (file, database).
    • Create linked services for the source (file) and destination (Azure SQL table).
    • Create datasets for the source and destination.
    • Create a pipeline to move data from source to destination.
  5. Monitor and manage: for each dataset slice you can see its status (whether it transferred or not), failure logs if any, and a re-run option if required.
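Putting steps 4 and 5 together, here is a hedged end-to-end sketch: a one-activity Copy pipeline definition plus the deployment and monitoring cmdlets. It assumes the gateway, linked services, and datasets (here "InputCsvFile" and "OutputSqlTable") were created as in the earlier sketches; every name and date is a placeholder.

```powershell
# Hedged sketch of steps 4-5 (AzureRM.DataFactories, ADF v1); all names are placeholders.

# Pipeline: a single Copy activity moving the on-premises CSV into the Azure SQL table.
@"
{
  "name": "CopyCsvToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "FileToSqlCopy",
        "type": "Copy",
        "inputs":  [ { "name": "InputCsvFile" } ],
        "outputs": [ { "name": "OutputSqlTable" } ],
        "typeProperties": {
          "source": { "type": "FileSystemSource" },
          "sink":   { "type": "SqlSink" }
        }
      }
    ],
    "start": "2015-12-15T00:00:00Z",
    "end":   "2015-12-16T00:00:00Z"
  }
}
"@ | Out-File .\CopyPipeline.json

New-AzureRmDataFactoryPipeline -ResourceGroupName "ADFDemoRG" `
    -DataFactoryName "MyDataFactory" -File ".\CopyPipeline.json"

# Step 5: inspect slice status, then re-run a failed slice by resetting it to Waiting.
Get-AzureRmDataFactorySlice -ResourceGroupName "ADFDemoRG" -DataFactoryName "MyDataFactory" `
    -DatasetName "OutputSqlTable" -StartDateTime "2015-12-15T00:00:00Z"
Set-AzureRmDataFactorySliceStatus -ResourceGroupName "ADFDemoRG" -DataFactoryName "MyDataFactory" `
    -DatasetName "OutputSqlTable" -StartDateTime "2015-12-15T00:00:00Z" `
    -EndDateTime "2015-12-16T00:00:00Z" -Status "Waiting" -UpdateType "UpstreamInPipeline"
```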

Azure Data Factory is a highly available, fault-tolerant, and fully managed service. Additionally, features like automatic cluster management, retries for transient failures, configurable timeout policies, and alerts make it a very convenient tool for processing data in a managed environment.