Azure Data Factory Online Course


Building reusable data frameworks has become an industry standard, and the ability to visualize, design, and implement them is a crucial skill. In this course, we'll focus on building a Metadata-Driven Ingestion Framework: a reusable system that can be adopted across different business units, saving both time and money. The course starts by introducing you to Azure Data Factory and guiding you through the creation of your first pipeline. Once you're comfortable with the basics, we'll move on to building a dynamic metadata-driven framework for efficient ingestion and operational monitoring. By the end of the course, you'll be equipped to design, implement, and deploy production-ready data ingestion solutions with Azure Data Factory.


Key Benefits

  • A beginner-friendly yet in-depth course on designing and implementing data ingestion pipelines in Azure Data Factory, suitable both for newcomers to the platform and for experienced professionals looking to expand their skills.
  • Provides industry-relevant insights and practical tips, ensuring that you gain the knowledge required to develop production-ready data ingestion solutions for real-world Azure projects.
  • Combines hands-on, practical learning with theoretical concepts, presented through interactive animations to enhance understanding and engagement.


Target Audience

This course is designed for aspiring data engineers and developers who want to explore Azure Data Factory as an alternative to traditional ETL (Extract, Transform, Load) tools. No prior experience with Microsoft Azure is required; all you need to get started is a basic PC or laptop.


Learning Objectives

  • Gain a comprehensive understanding of Azure Data Factory and Azure Blob Storage, including their functionalities and integration.
  • Explore key concepts in data engineering, data lakes, and metadata-driven frameworks, and how they apply to modern data solutions.
  • Study an industry-based case example to learn how to build efficient ingestion frameworks tailored for real-world applications.
  • Learn to create dynamic Azure Data Factory pipelines, incorporating email notifications and automation with Logic Apps (a sketch of the notification call follows this list).
  • Understand the process of tracking pipeline executions and batch runs for effective monitoring and optimization.
  • Explore version management practices with Azure DevOps to ensure streamlined collaboration and deployment processes.
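The email notifications mentioned above are wired through an HTTP-triggered Logic App: Data Factory posts a JSON payload to the Logic App's callback URL (from a Web activity), and the Logic App turns it into an email. The payload shape is easiest to see in plain Python; this is a minimal sketch in which the URL and the field names (pipelineName, status, message) are hypothetical placeholders, not the course's exact values:

```python
import requests

# Hypothetical callback URL; in practice you copy this from the Logic App's
# "When a HTTP request is received" trigger after saving the workflow.
LOGIC_APP_URL = "https://prod-00.westeurope.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke"

def send_pipeline_alert(pipeline_name: str, status: str, message: str) -> None:
    """POST a notification payload for the Logic App to forward as an email."""
    payload = {
        "pipelineName": pipeline_name,
        "status": status,
        "message": message,
    }
    response = requests.post(LOGIC_APP_URL, json=payload, timeout=30)
    response.raise_for_status()

send_pipeline_alert("IngestFinanceFiles", "Failed", "Copy activity timed out.")
```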

Course Outline

The Azure Data Factory course covers the following modules:

Module 1 - Introduction – Building Your First Azure Data Pipeline

  • Course Overview
  • Introduction to Azure Data Factory (ADF)
  • Discussion of Requirements and Technical Architecture
  • Register for a Free Azure Account
  • Create a Data Factory Resource
  • Set Up a Storage Account and Upload Data
  • Create a Data Lake Storage Gen2 Account
  • Install Storage Explorer
  • Build Your First Azure Data Pipeline
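Module 1 builds this first pipeline by clicking through the Azure portal. For orientation, here is a minimal sketch of the same copy pipeline expressed with the azure-mgmt-datafactory Python SDK, in the style of Microsoft's Python quickstart. The resource names, credentials, and dataset names are placeholders, and the storage linked service and the two datasets are assumed to have been created already:

```python
from azure.identity import ClientSecretCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    Factory, PipelineResource, CopyActivity,
    DatasetReference, BlobSource, BlobSink,
)

# Service-principal login; all IDs below are placeholders.
credential = ClientSecretCredential(
    tenant_id="<tenant-id>", client_id="<client-id>", client_secret="<secret>"
)
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")

rg, df = "rg-adf-course", "adf-course-factory"

# Create the Data Factory resource itself.
adf_client.factories.create_or_update(rg, df, Factory(location="westeurope"))

# A single copy activity: blob storage in, data lake out. The two datasets
# ("SourceBlobDataset", "SinkBlobDataset") are assumed to exist already.
copy = CopyActivity(
    name="CopyBlobToDataLake",
    inputs=[DatasetReference(type="DatasetReference", reference_name="SourceBlobDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SinkBlobDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)
adf_client.pipelines.create_or_update(
    rg, df, "FirstIngestionPipeline", PipelineResource(activities=[copy])
)

# Kick off a run and keep the run ID for monitoring.
run = adf_client.pipelines.create_run(rg, df, "FirstIngestionPipeline", parameters={})
print("Pipeline run ID:", run.run_id)
```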


Module 2 - Metadata-Driven Ingestion

  • Overview of Metadata-Driven Ingestion
  • High-Level Strategy
  • Create an Active Directory User
  • Assign the Contributor Role to the User
  • Disable Security Defaults
  • Set Up the Metadata Database
  • Install Azure Data Studio
  • Create Metadata Tables and Stored Procedures
  • Reconfigure Existing Data Factory Artifacts
  • Set Up a Logic App for Email Notifications
  • Modify Data Factory Pipeline to Include Email Notifications
  • Create Linked Services for Metadata Database and Email Datasets
  • Create a Utility Pipeline for Email Notifications
  • Explanation of the Email Recipients Table
  • Explanation of the Get Email Addresses Stored Procedure
  • Modify Ingestion Pipeline to Use Email Utility Pipeline
  • Monitor the Triggered Pipeline
  • Make Email Notifications Dynamic
  • Dynamic Logging of Pipeline Information
  • Add a New Logging Method for the Main Ingestion Pipeline
  • Modify Pipeline Logging to Only Send Failure Alerts
  • Create Dynamic Datasets
  • Source to Target Data Transfer – Part 1
  • Source to Target Data Transfer – Part 2
  • Explanation of the Source to Target Stored Procedure
  • Add Orchestration Pipeline – Part 1
  • Add Orchestration Pipeline – Part 2
  • Address Duplicate Batch Ingestions
  • Review the Pipeline Log and Related Tables
  • Understand the GetBatch Stored Procedure
  • Understand the Set Batch Status and GetRunID Procedures
  • Set Up an Azure DevOps Git Repository
  • Publish Data Factory to Azure DevOps
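This module's pattern is easiest to grasp as code. Below is a rough sketch under assumptions of my own: a hypothetical dbo.SourceMetadata control table (the course builds its own tables and stored procedures, which will differ) and an orchestrator that launches one parameterised pipeline run per enabled row. It connects with pyodbc and reuses adf_client, rg, and df from the Module 1 sketch; in Data Factory itself the same loop is the orchestration pipeline, presumably a Lookup feeding a ForEach:

```python
import pyodbc

# Connect to the metadata database (placeholders throughout).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;DATABASE=MetadataDB;"
    "UID=<user>;PWD=<password>"
)
cursor = conn.cursor()

# Hypothetical control table: one row per source entity to ingest.
cursor.execute("""
IF OBJECT_ID('dbo.SourceMetadata') IS NULL
CREATE TABLE dbo.SourceMetadata (
    SourceId     INT IDENTITY PRIMARY KEY,
    SourceSystem NVARCHAR(100) NOT NULL,
    SourcePath   NVARCHAR(400) NOT NULL,
    TargetPath   NVARCHAR(400) NOT NULL,
    Enabled      BIT NOT NULL DEFAULT 1
)
""")
conn.commit()

# Every enabled row becomes one parameterised run of the ingestion pipeline.
for source_system, source_path, target_path in cursor.execute(
    "SELECT SourceSystem, SourcePath, TargetPath "
    "FROM dbo.SourceMetadata WHERE Enabled = 1"
):
    adf_client.pipelines.create_run(
        rg, df, "MetadataDrivenIngestion",
        parameters={
            "sourceSystem": source_system,
            "sourcePath": source_path,
            "targetPath": target_path,
        },
    )
```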


Module 3 - Event-Driven Ingestion

  • Introduction
  • Read Plan for Azure Storage
  • Create Finance Container and Upload Files
  • Create Source Dataset
  • Write Plan for Data Lake – Raw Data
  • Set Up Finance Container and Directories
  • Create Sink Dataset
  • Data Factory Pipeline Overview
  • Create Data Factory and Read Metadata
  • Apply Filter for CSV Files
  • Add Dataset to Read Files
  • Add the For Each CSV File Activity and Test Ingestion
  • Define Event-Based Trigger Plan
  • Enable Event Grid Provider
  • Delete File and Set Up Event-Based Trigger
  • Create Event-Based Trigger
  • Publish Code to Main Branch and Activate Trigger
  • Trigger Event-Based Ingestion
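The event-based trigger at the heart of this module can be sketched with the same SDK used in the Module 1 example. This assumes a recent azure-mgmt-datafactory release and reuses adf_client, rg, and df from that sketch; the storage account resource ID, the /finance container path, and the pipeline name are placeholders:

```python
from azure.mgmt.datafactory.models import (
    TriggerResource, BlobEventsTrigger,
    TriggerPipelineReference, PipelineReference,
)

# Resource ID of the storage account being watched (placeholder).
storage_account_id = (
    "/subscriptions/<subscription-id>/resourceGroups/rg-adf-course"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

# Fire the ingestion pipeline whenever a .csv blob lands in /finance.
trigger = TriggerResource(
    properties=BlobEventsTrigger(
        scope=storage_account_id,
        events=["Microsoft.Storage.BlobCreated"],
        blob_path_begins_with="/finance/blobs/",
        blob_path_ends_with=".csv",
        pipelines=[
            TriggerPipelineReference(
                pipeline_reference=PipelineReference(
                    type="PipelineReference",
                    reference_name="EventDrivenIngestion",
                ),
                parameters={},
            )
        ],
    )
)
adf_client.triggers.create_or_update(rg, df, "OnFinanceCsvArrival", trigger)

# Triggers are created in a stopped state; starting one is a long-running
# operation, hence begin_start(...).result().
adf_client.triggers.begin_start(rg, df, "OnFinanceCsvArrival").result()
```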

Tags: Azure Data Factory Practice Exam, Azure Data Factory Online Course, Azure Data Factory Training, Azure Data Factory Tutorial, Learn Azure Data Factory, Azure Data Factory Study Guide