Duration
5 Days (30 CPD hours)

This course is intended for
The primary audience for this course is database professionals who need to fulfil a Business Intelligence Developer role. They will focus on hands-on work creating BI solutions, including data warehouse implementation, ETL, and data cleansing.

Overview
Create sophisticated SSIS packages for extracting, transforming, and loading data
Use containers to efficiently control repetitive tasks and transactions
Configure packages to dynamically adapt to environment changes
Use Data Quality Services to cleanse data
Successfully troubleshoot packages
Create and manage the SSIS Catalog
Deploy, configure, and schedule packages
Secure the SSIS Catalog

SQL Server Integration Services is the Community Courseware version of 20767CC: Implementing a SQL Data Warehouse. This five-day instructor-led course is intended for IT professionals who need to learn how to use SSIS to build, deploy, maintain, and secure Integration Services projects and packages, and to use SSIS to extract, transform, and load data to and from SQL Server. This course is similar to the retired Course 20767-C: Implementing a SQL Data Warehouse, but focuses more on building packages than on the entire data warehouse design and implementation.

Prerequisites
Working knowledge of T-SQL and SQL Server Agent jobs is helpful, but not required.
Basic knowledge of the Microsoft Windows operating system and its core functionality.
Working knowledge of relational databases.
Some experience with database design.

1 - SSIS Overview
Import/Export Wizard
Exporting Data with the Wizard
Common Import Concerns
Quality Checking Imported/Exported Data

2 - Working with Solutions and Projects
Working with SQL Server Data Tools
Understanding Solutions and Projects
Working with the Visual Studio Interface

3 - Basic Control Flow
Working with Tasks
Understanding Precedence Constraints
Annotating Packages
Grouping Tasks
Package and Task Properties
Connection Managers
Favorite Tasks

4 - Common Tasks
Analysis Services Processing
Data Profiling Task
Execute Package Task
Execute Process Task
Expression Task
File System Task
FTP Task
Hadoop Task
Script Task Introduction
Send Mail Task
Web Service Task
XML Task

5 - Data Flow Sources and Destinations
The Data Flow Task
The Data Flow SSIS Toolbox
Working with Data Sources
SSIS Data Sources
Working with Data Destinations
SSIS Data Destinations

6 - Data Flow Transformations
Transformations
Configuring Transformations

7 - Making Packages Dynamic
Features for Making Packages Dynamic
Package Parameters
Project Parameters
Variables
SQL Parameters
Expressions in Tasks
Expressions in Connection Managers
After Deployment
How It All Fits Together

8 - Containers
Sequence Containers
For Loop Containers
Foreach Loop Containers

9 - Troubleshooting and Package Reliability
Understanding MaximumErrorCount
Breakpoints
Redirecting Error Rows
Logging
Event Handlers
Using Checkpoints
Transactions

10 - Deploying to the SSIS Catalog
The SSIS Catalog
Deploying Projects
Working with Environments
Executing Packages in SSMS
Executing Packages from the Command Line
Deployment Model Differences

11 - Installing and Administering SSIS
Installing SSIS
Upgrading SSIS
Managing the SSIS Catalog
Viewing Built-in SSIS Reports
Managing SSIS Logging and Operation Histories
Automating Package Execution (see the T-SQL sketch after this outline)

12 - Securing the SSIS Catalog
Principals
Securables
Grantable Permissions
Granting Permissions
Configuring Proxy Accounts
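For readers who want a concrete taste of the topics in modules 10 and 11, the following is a minimal T-SQL sketch of running a package that has been deployed to the SSIS Catalog, using the standard SSISDB catalog stored procedures. It is an illustration only, not part of the official courseware; the folder, project, and package names are hypothetical placeholders.

```sql
-- Minimal sketch: run a package deployed to the SSIS Catalog.
-- DemoFolder, DemoProject, and LoadSales.dtsx are hypothetical placeholders.
DECLARE @execution_id BIGINT;

-- Create an execution instance for the deployed package
EXEC SSISDB.catalog.create_execution
     @folder_name     = N'DemoFolder',
     @project_name    = N'DemoProject',
     @package_name    = N'LoadSales.dtsx',
     @use32bitruntime = 0,
     @reference_id    = NULL,            -- optionally pass an environment reference
     @execution_id    = @execution_id OUTPUT;

-- Run synchronously so the calling session waits for the package to finish
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id    = @execution_id,
     @object_type     = 50,              -- 50 = system parameter
     @parameter_name  = N'SYNCHRONIZED',
     @parameter_value = 1;

EXEC SSISDB.catalog.start_execution @execution_id;
```

Wrapping this call pattern in a SQL Server Agent job step is one common way to schedule a deployed package, which is the kind of scenario the "Automating Package Execution" topic refers to.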
Additional course details
Nexus Humans 55321 SQL Server Integration Services training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're just stepping into the realm of professional skills or are a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for 55321 SQL Server Integration Services and one of our Top 10, we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.
Duration
4 Days (24 CPD hours)

This course is intended for
The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about data engineering and building analytical solutions using data platform technologies that exist on Microsoft Azure. The secondary audience for this course includes data analysts and data scientists who work with analytical solutions built on Microsoft Azure.

In this course, the student will learn how to implement and manage data engineering workloads on Microsoft Azure, using Azure services such as Azure Synapse Analytics, Azure Data Lake Storage Gen2, Azure Stream Analytics, Azure Databricks, and others. The course focuses on common data engineering tasks such as orchestrating data transfer and transformation pipelines, working with data files in a data lake, creating and loading relational data warehouses, capturing and aggregating streams of real-time data, and tracking data assets and lineage.

Prerequisites
Successful students start this course with knowledge of cloud computing and core data concepts and professional experience with data solutions.
AZ-900T00 Microsoft Azure Fundamentals
DP-900T00 Microsoft Azure Data Fundamentals

1 - Introduction to data engineering on Azure
What is data engineering
Important data engineering concepts
Data engineering in Microsoft Azure

2 - Introduction to Azure Data Lake Storage Gen2
Understand Azure Data Lake Storage Gen2
Enable Azure Data Lake Storage Gen2 in Azure Storage
Compare Azure Data Lake Store to Azure Blob storage
Understand the stages for processing big data
Use Azure Data Lake Storage Gen2 in data analytics workloads

3 - Introduction to Azure Synapse Analytics
What is Azure Synapse Analytics
How Azure Synapse Analytics works
When to use Azure Synapse Analytics

4 - Use Azure Synapse serverless SQL pool to query files in a data lake
Understand Azure Synapse serverless SQL pool capabilities and use cases
Query files using a serverless SQL pool
Create external database objects

5 - Use Azure Synapse serverless SQL pools to transform data in a data lake
Transform data files with the CREATE EXTERNAL TABLE AS SELECT statement (see the T-SQL sketch after this outline)
Encapsulate data transformations in a stored procedure
Include a data transformation stored procedure in a pipeline

6 - Create a lake database in Azure Synapse Analytics
Understand lake database concepts
Explore database templates
Create a lake database
Use a lake database

7 - Analyze data with Apache Spark in Azure Synapse Analytics
Get to know Apache Spark
Use Spark in Azure Synapse Analytics
Analyze data with Spark
Visualize data with Spark

8 - Transform data with Spark in Azure Synapse Analytics
Modify and save dataframes
Partition data files
Transform data with SQL

9 - Use Delta Lake in Azure Synapse Analytics
Understand Delta Lake
Create Delta Lake tables
Create catalog tables
Use Delta Lake with streaming data
Use Delta Lake in a SQL pool

10 - Analyze data in a relational data warehouse
Design a data warehouse schema
Create data warehouse tables
Load data warehouse tables
Query a data warehouse

11 - Load data into a relational data warehouse
Load staging tables
Load dimension tables
Load time dimension tables
Load slowly changing dimensions
Load fact tables
Perform post load optimization

12 - Build a data pipeline in Azure Synapse Analytics
Understand pipelines in Azure Synapse Analytics
Create a pipeline in Azure Synapse Studio
Define data flows
Run a pipeline

13 - Use Spark Notebooks in an Azure Synapse Pipeline
Understand Synapse Notebooks and Pipelines
Use a Synapse notebook activity in a pipeline
Use parameters in a notebook

14 - Plan hybrid transactional and analytical processing using Azure Synapse Analytics
Understand hybrid transactional and analytical processing patterns
Describe Azure Synapse Link

15 - Implement Azure Synapse Link with Azure Cosmos DB
Enable Cosmos DB account to use Azure Synapse Link
Create an analytical store enabled container
Create a linked service for Cosmos DB
Query Cosmos DB data with Spark
Query Cosmos DB with Synapse SQL

16 - Implement Azure Synapse Link for SQL
What is Azure Synapse Link for SQL?
Configure Azure Synapse Link for Azure SQL Database
Configure Azure Synapse Link for SQL Server 2022

17 - Get started with Azure Stream Analytics
Understand data streams
Understand event processing
Understand window functions

18 - Ingest streaming data using Azure Stream Analytics and Azure Synapse Analytics
Stream ingestion scenarios
Configure inputs and outputs
Define a query to select, filter, and aggregate data
Run a job to ingest data

19 - Visualize real-time data with Azure Stream Analytics and Power BI
Use a Power BI output in Azure Stream Analytics
Create a query for real-time visualization
Create real-time data visualizations in Power BI

20 - Introduction to Microsoft Purview
What is Microsoft Purview?
How Microsoft Purview works
When to use Microsoft Purview

21 - Integrate Microsoft Purview and Azure Synapse Analytics
Catalog Azure Synapse Analytics data assets in Microsoft Purview
Connect Microsoft Purview to an Azure Synapse Analytics workspace
Search a Purview catalog in Synapse Studio
Track data lineage in pipelines

22 - Explore Azure Databricks
Get started with Azure Databricks
Identify Azure Databricks workloads
Understand key concepts

23 - Use Apache Spark in Azure Databricks
Get to know Spark
Create a Spark cluster
Use Spark in notebooks
Use Spark to work with data files
Visualize data

24 - Run Azure Databricks Notebooks with Azure Data Factory
Understand Azure Databricks notebooks and pipelines
Create a linked service for Azure Databricks
Use a Notebook activity in a pipeline
Use parameters in a notebook
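Module 5 names the CREATE EXTERNAL TABLE AS SELECT (CETAS) statement used in a Synapse serverless SQL pool. As a hedged illustration of the general shape of such a transformation (not part of the official courseware), the sketch below aggregates CSV files from a data lake into a Parquet-backed external table. The external data source (MyDataLake), file format (ParquetFormat), paths, and column names are hypothetical placeholders and are assumed to already exist in the serverless database.

```sql
-- Minimal CETAS sketch for a Synapse serverless SQL pool.
-- Assumes an external data source (MyDataLake) and an external file format
-- (ParquetFormat) have already been created; all names and paths are placeholders.
CREATE EXTERNAL TABLE dbo.ProductSales
WITH (
    LOCATION    = 'curated/product_sales/',   -- where the result files are written
    DATA_SOURCE = MyDataLake,
    FILE_FORMAT = ParquetFormat
)
AS
SELECT ProductID,
       SUM(Quantity)  AS TotalQuantity,
       SUM(LineTotal) AS TotalRevenue
FROM OPENROWSET(
         BULK 'raw/sales/*.csv',
         DATA_SOURCE = 'MyDataLake',
         FORMAT = 'CSV',
         PARSER_VERSION = '2.0',
         HEADER_ROW = TRUE
     ) AS sales
GROUP BY ProductID;
```

The statement both writes the aggregated result as files under the LOCATION path and registers an external table over them, which is why the module pairs this topic with encapsulating the transformation in a stored procedure and calling it from a pipeline.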
Additional course details
Nexus Humans DP-203T00 Data Engineering on Microsoft Azure training program is a workshop that presents an invigorating mix of sessions, lessons, and masterclasses meticulously crafted to propel your learning expedition forward. This immersive bootcamp-style experience boasts interactive lectures, hands-on labs, and collaborative hackathons, all strategically designed to fortify fundamental concepts. Guided by seasoned coaches, each session offers priceless insights and practical skills crucial for honing your expertise. Whether you're just stepping into the realm of professional skills or are a seasoned professional, this comprehensive course ensures you're equipped with the knowledge and prowess necessary for success. While we feel this is the best course for DP-203T00 Data Engineering on Microsoft Azure and one of our Top 10, we encourage you to read the course outline to make sure it is the right content for you. Additionally, private sessions, closed classes or dedicated events are available both live online and at our training centres in Dublin and London, as well as at your offices anywhere in the UK, Ireland or across EMEA.