
Azure Data Factory for Beginners - Build Data Ingestion

  • 30 Day Money Back Guarantee
  • Completion Certificate
  • 24/7 Technical Support

Highlights

  • On-Demand course
  • 6 hours 29 minutes
  • All levels

Description

A beginner-level course that teaches data engineering techniques for building metadata-driven frameworks with Azure data engineering tools such as Data Factory, Azure SQL, and others. No prior experience with Azure Data Factory is required.

Building frameworks is now an industry norm, and knowing how to visualize, design, plan, and implement data frameworks has become an important skill. The framework we will build together is a metadata-driven ingestion framework. Metadata-driven frameworks allow a company to develop a system once and have it adopted and reused by various business clusters without additional development, saving the business time and cost. Think of it as a plug-and-play system.

The first objective of the course is to onboard you onto the Azure Data Factory platform and help you assemble your first Azure Data Factory pipeline. Once you have a good grip on the Azure Data Factory development pattern, it becomes easier to adopt the same pattern to onboard other sources and data sinks.

Once you are comfortable building a basic Azure Data Factory pipeline, the second objective is to build a fully fledged, working metadata-driven framework that makes ingestion more dynamic. Furthermore, we will build the framework so that you can audit every batch orchestration and individual pipeline run for business intelligence and operational monitoring. By the end of this course, you will be able to design, implement, and get production-ready for data ingestion in Azure.

All the resource files are available in the GitHub repository at https://github.com/PacktPublishing/Azure-Data-Factory-for-Beginners---Build-Data-Ingestion
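To make the "develop once, reuse everywhere" idea concrete, here is a minimal, illustrative sketch of metadata-driven ingestion in Python. This is not the course's actual framework (which uses Azure SQL tables and parameterized Data Factory pipelines); the record fields and function names below are hypothetical, but the shape is the same: source/sink pairs live in metadata, one generic routine copies whatever the metadata enables, and every run is audited.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class IngestionEntry:
    source_path: str      # where to read from (e.g. a blob container path)
    sink_path: str        # where to land the data (e.g. a data lake folder)
    enabled: bool = True  # toggling rows on/off needs no redevelopment

@dataclass
class RunLog:
    # Per-pipeline audit records, analogous to the framework's pipeline log.
    entries: list = field(default_factory=list)

def run_batch(metadata: list[IngestionEntry],
              copy: Callable[[str, str], None],
              log: RunLog) -> int:
    """Execute one batch: copy every enabled entry and audit each run."""
    copied = 0
    for entry in metadata:
        if not entry.enabled:
            continue
        copy(entry.source_path, entry.sink_path)
        log.entries.append((entry.source_path, entry.sink_path, "Succeeded"))
        copied += 1
    return copied

# Onboarding a new source is just a new metadata row - no new pipeline code.
metadata = [
    IngestionEntry("landing/finance/sales.csv", "raw/finance/sales.csv"),
    IngestionEntry("landing/hr/staff.csv", "raw/hr/staff.csv", enabled=False),
]
log = RunLog()
copied = run_batch(metadata, copy=lambda src, dst: None, log=log)
```

The no-op `copy` lambda stands in for the actual data movement; in Azure Data Factory that role is played by a Copy activity whose source and sink are resolved from the metadata database at runtime.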

What You Will Learn

  • Learn about Azure Data Factory and Azure Blob Storage
  • Understand data engineering, data lake, and metadata-driven framework concepts
  • Look at an industry-based example of how to build ingestion frameworks
  • Learn dynamic Azure Data Factory pipelines and email notifications with logic apps
  • Study tracking of pipelines and batch runs
  • Look at version management with Azure DevOps
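The "dynamic pipelines" item above refers to parameterizing a single dataset definition so it can serve every source. As a hedged sketch (the template and parameter names here are illustrative, not Data Factory syntax), the idea reduces to substituting runtime parameters into one path template:

```python
def resolve_dataset(template: str, **params: str) -> str:
    """Substitute runtime parameters into a parameterized dataset path."""
    return template.format(**params)

# One dataset definition serves every source; only the parameters change.
TEMPLATE = "{container}/{directory}/{file_name}"

path = resolve_dataset(TEMPLATE, container="finance",
                       directory="raw", file_name="sales.csv")
```

In Azure Data Factory the same effect is achieved with dataset parameters and expressions, which the metadata database supplies per pipeline run.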

Audience

This course is ideal for aspiring data engineers and developers who are curious about Azure Data Factory as an ETL alternative.

You will need a basic PC/laptop; no prior knowledge of Microsoft Azure is required.

Approach

This course is comprehensive and well structured into three parts: creating your first pipeline, metadata-driven ingestion, and event-driven ingestion. It is highly practical, using real-life examples of how data engineers actually work to reinforce your learning, and the theoretical concepts are explained with animations to keep you engaged throughout.

Key Features

  • A beginner-friendly and comprehensive course on designing and implementing Azure data pipeline ingestion
  • Industry-based examples along with tips and tricks for production-ready data ingestion in an Azure project
  • A highly practical course, with theoretical concepts animated for better engagement

GitHub Repo

https://github.com/PacktPublishing/Azure-Data-Factory-for-Beginners---Build-Data-Ingestion

About the Author
David Mngadi

David Mngadi is a data management professional who is fascinated by the power of data in our lives and has helped several companies become more data-driven to gain a competitive edge and meet regulatory requirements. Over the last 15 years, he has had the pleasure of designing and implementing data warehousing solutions in the retail, telecom, and banking industries, and more recently in big data lake-specific implementations. He is passionate about technology and teaching programming online.

Course Outline

1. Introduction - Build Your First Azure Data Pipeline

1. Introduction to the Course

This video provides an introduction to the course and basics of ADF.

2. Introduction to ADF (Azure Data Factory)

This video provides an introduction to ADF (Azure Data Factory).

3. Requirements Discussion and Technical Architecture

This video talks about the requirements discussion and technical architecture.

4. Register a Free Azure Account

This video explains how to register a free Azure account.

5. Create a Data Factory Resource

This video helps you create a Data Factory resource.

6. Create a Storage Account and Upload Data

This video explains how to create a storage account and upload data.

7. Create Data Lake Gen 2 Storage Account

This video demonstrates how to create a Data Lake Gen 2 storage account.

8. Download Storage Explorer

This video shows you how to download Storage Explorer.

9. Create Your First Azure Pipeline

This video talks about creating your first Azure pipeline.

10. Closing Remarks

This video concludes the section with the closing remarks.

2. Metadata-Driven Ingestion

1. Introduction to Metadata-Driven Ingestion

This video provides an introduction to metadata-driven ingestion.

2. High-Level Plan

This video talks about the high-level plan you would be working on in this section.

3. Create Active Directory User

This video explains how to create an Active Directory user.

4. Assign the Contributor Role to the User

This video explains how to assign the contributor role to the user.

5. Disable Security Defaults

This video explains how to disable security defaults.

6. Creating the Metadata Database

This video talks about creating the metadata database.

7. Install Azure Data Studio

This video demonstrates installing Azure Data Studio.

8. Create Metadata Tables and Stored Procedures

This video explains creating metadata tables and stored procedures.

9. Reconfigure Existing Data Factory Artifacts

This video helps you reconfigure existing Data Factory artifacts.

10. Set Up Logic App to Handle Email Notifications

This video explains setting up a logic app to handle email notifications.

11. Modify the Data Factory Pipeline to Send an Email Notification

This video shows how to modify the Data Factory pipeline to send an email notification.

12. Create Linked Service for Metadata Database and Email Dataset

This video demonstrates creating a linked service for the metadata database and the email dataset.

13. Create Utility Pipeline to Send Email Notifications

This video explains how to create a utility pipeline to send email notifications.

14. Explaining the Email Recipients Table

This video helps in explaining the 'Email recipients' table.

15. Explaining the Get Email Addresses Stored Procedure

This video explains the get email addresses stored procedure.

16. Modify Ingestion Pipeline to Use the Email Utility Pipeline

This video helps you modify the ingestion pipeline to use the email utility pipeline.

17. Tracking the Triggered Pipeline

This video lets you track the triggered pipeline.

18. Making the Email Notifications Dynamic

This video demonstrates making the email notifications dynamic.

19. Making Logging of Pipeline Information Dynamic

This video shows how to make the logging of pipeline information dynamic.

20. Add a New Way to Log the Main Ingestion Pipeline

This video explains how to add a new way to log the main ingestion pipeline.

21. Change the Logging of Pipelines to Send Fail Message Only

This video explains how to change the logging of pipelines to send a failure message only.

22. Creating Dynamic Datasets

This video talks about creating dynamic datasets.

23. Reading from Source to Target - Part 1

This is the first of two videos on reading from source to target.

24. Reading from Source to Target - Part 2

This is the second of two videos on reading from source to target.

25. Explaining the Source to Target Stored Procedure

This video helps in explaining the source to target stored procedure.

26. Add Orchestration Pipeline - Part 1

This is the first of two videos explaining how to add the orchestration pipeline.

27. Add Orchestration Pipeline - Part 2

This is the second of two videos explaining how to add the orchestration pipeline.

28. Fixing the Duplicating Batch Ingestions

This video explains fixing the duplicating batch ingestions.

29. Understanding the Pipeline Log and Related Tables

This video helps in understanding the pipeline log and related tables.

30. Understanding the GetBatch Stored Procedure

This video helps in understanding the GetBatch stored procedure.

31. Understanding the Set Batch Status and GetRunID

This video helps in understanding the Set Batch Status and GetRunID stored procedures.

32. Setting Up an Azure DevOps Git Repository

This video demonstrates setting up an Azure DevOps Git repository.

33. Publishing the Data Factory to Azure DevOps

This video helps in publishing the Data Factory to Azure DevOps.

34. Closing Remarks

This video concludes the section with the closing remarks.

3. Event-Driven Ingestion

1. Introduction

This video provides an introduction to the section.
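Event-driven ingestion, the subject of this section, means a pipeline runs in reaction to a storage event rather than on a schedule. As a hedged sketch under stated assumptions: in Azure this is done with an Event Grid trigger firing a Data Factory pipeline when a blob is created, while below a toy event bus stands in for Event Grid and the handler and event fields are illustrative only.

```python
from typing import Callable

class EventBus:
    """Tiny stand-in for Event Grid: subscribers react to published events."""
    def __init__(self) -> None:
        self._handlers: list[Callable[[dict], None]] = []

    def subscribe(self, handler: Callable[[dict], None]) -> None:
        self._handlers.append(handler)

    def publish(self, event: dict) -> None:
        for handler in self._handlers:
            handler(event)

ingested: list[str] = []

def ingestion_pipeline(event: dict) -> None:
    # Only react to newly created CSV files, mirroring the course's
    # filter-by-CSV step; other blobs are ignored.
    if event["type"] == "BlobCreated" and event["path"].endswith(".csv"):
        ingested.append(event["path"])

bus = EventBus()
bus.subscribe(ingestion_pipeline)
bus.publish({"type": "BlobCreated", "path": "finance/sales.csv"})
bus.publish({"type": "BlobCreated", "path": "finance/readme.txt"})
```

The design point carries over directly: the trigger delivers the event, and the pipeline's filter decides which files actually start an ingestion run.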

2. Read from Azure Storage Plan

This video explains the plan for reading from Azure Storage.

3. Create Finance Container and Upload Files

This video explains how to create the finance container and upload files.

4. Create Source Dataset

This video helps you create the source dataset.

5. Write to Data Lake - Raw Plan

This video explains the plan for writing raw data to the data lake.

6. Create Finance Container and Directories

This video helps you create the finance container and directories.

7. Create Sink Dataset

This video demonstrates creating the sink dataset.

8. Data Factory Pipeline Plan

This video explains the Data Factory pipeline plan.

9. Create Data Factory and Read Metadata

This video explains creating the Data Factory and reading metadata.

10. Add Filter by CSV

This video demonstrates how to add a filter for CSV files.

11. Add Dataset to Read Files

This video talks about adding a dataset to read files.

12. Add the For Each CSV File Activity and Test Ingestion

This video explains how to add the For Each CSV file activity and test ingestion.

13. Adding the Event-Based Trigger Plan

This video demonstrates adding the event-based trigger plan.

14. Enable the Event Grid Provider

This video demonstrates enabling the Event Grid provider.

15. Delete File and Add Event-Based Trigger

This video demonstrates deleting a file and adding an event-based trigger.

16. Create Event-Based Trigger

This video helps you create an event-based trigger.

17. Publish Code to Main Branch and Start Trigger

This video lets you publish the code to the main branch and start the trigger.

18. Trigger Event-Based Ingestion

This video helps you trigger event-based ingestion.

19. Closing Remarks

This video concludes the section with the closing remarks.

Course Content

  1. Azure Data Factory for Beginners - Build Data Ingestion

About The Provider

Packt
Birmingham
Founded in 2004 in Birmingham, UK, Packt’s mission is to help the world put software to work in new ways, through the delivery of effective learning and i...
