
DP-203: Data Engineering on Microsoft Azure

  • PROMO: -15%

In this course, students will learn about data engineering as it pertains to building batch and real-time analytical solutions using Azure data platform technologies. They will begin by understanding the core compute and storage technologies used to build an analytical solution, and will learn how to interactively explore data stored in files in a data lake.

They will learn the various ingestion techniques that can be used to load data using the Apache Spark capability found in Azure Synapse Analytics or Azure Databricks, as well as how to ingest data using Azure Data Factory or Azure Synapse pipelines. Students will also learn the various ways they can transform the data using the same technologies that are used to ingest it.

They will understand the importance of implementing security to ensure that data is protected at rest and in transit. Students will then learn how to build a real-time analytical system to deliver real-time analytical solutions.


Target audience

  • The primary audience for this course is data professionals, data architects, and business intelligence professionals who want to learn about data engineering and building analytical solutions using data platform technologies that exist on Microsoft Azure.
  • The secondary audience for this course is data analysts and data scientists who work with analytical solutions built on Microsoft Azure.

Prerequisites

Successful students start this course with knowledge of cloud computing and core data concepts, and with professional experience implementing data solutions. Specifically, they will have completed:

Objectives

  • Prepare for Exam DP-203: Data Engineering on Microsoft Azure
  • Create a Storage Account, and choose the right model for the data you want to store in the cloud
  • Create and manage data pipelines in the cloud using Azure Data Factory
  • Explore the tools and techniques that can be used to work with Modern Data Warehouses productively and securely within Azure Synapse Analytics
  • Perform data engineering with Azure Synapse Apache Spark Pools
  • Harness the power of Apache Spark and powerful clusters running on the Azure Databricks platform to run large data engineering workloads in the cloud
  • Explore how Azure Data Lake Storage fits into common architectures, as well as the different methods of uploading data to the data store
  • Examine the range of security features that help keep your data secure

Programme

  • Azure for the Data Engineer
  • Store data in Azure
  • Data integration at scale with Azure Data Factory or Azure Synapse Pipeline
  • Realize Integrated Analytical Solutions with Azure Synapse Analytics
  • Work with Data Warehouses using Azure Synapse Analytics
  • Perform data engineering with Azure Synapse Apache Spark Pools
  • Work with Hybrid Transactional and Analytical Processing Solutions using Azure Synapse Analytics
  • Data engineering with Azure Databricks
  • Large-Scale Data Processing with Azure Data Lake Storage Gen2
  • Implement a Data Streaming Solution with Azure Stream Analytics

Azure for the Data Engineer

  • Understand the evolving world of data
  • Survey the services on the Microsoft Intelligent Data Platform
  • Identify the tasks of a data engineer in a cloud-hosted architecture

Store data in Azure

  • Choose a data storage approach in Azure
  • Create an Azure Storage account
  • Connect an app to Azure Storage
  • Secure your Azure Storage account
  • Store application data with Azure Blob storage

Data integration at scale with Azure Data Factory or Azure Synapse Pipeline

  • Integrate data with Azure Data Factory or Azure Synapse Pipeline
  • Petabyte-scale ingestion with Azure Data Factory or Azure Synapse Pipeline
  • Perform code-free transformation at scale with Azure Data Factory or Azure Synapse Pipeline
  • Populate slowly changing dimensions in Azure Synapse Analytics pipelines
  • Orchestrate data movement and transformation in Azure Data Factory or Azure Synapse Pipeline
  • Execute existing SSIS packages in Azure Data Factory or Azure Synapse Pipeline
  • Operationalize your Azure Data Factory or Azure Synapse Pipeline
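
One topic above, populating slowly changing dimensions, is done in the course with Azure Synapse pipelines rather than hand-written code, but the underlying Type 2 logic can be illustrated with a minimal pure-Python sketch (the table shape and field names here are hypothetical):

```python
from datetime import date

# Hypothetical Type 2 slowly changing dimension: when a tracked attribute
# changes, expire the current row and insert a new current row instead of
# updating in place, preserving history.
def apply_scd2(dimension, updates, today):
    for upd in updates:
        current = next(
            (row for row in dimension
             if row["customer_id"] == upd["customer_id"] and row["is_current"]),
            None,
        )
        if current is None:
            # New customer: insert an open-ended current row.
            dimension.append({**upd, "valid_from": today,
                              "valid_to": None, "is_current": True})
        elif current["city"] != upd["city"]:
            # Attribute changed: expire the old row, insert a new current one.
            current["valid_to"] = today
            current["is_current"] = False
            dimension.append({**upd, "valid_from": today,
                              "valid_to": None, "is_current": True})
    return dimension

dim = [{"customer_id": 1, "city": "Lisbon",
        "valid_from": date(2020, 1, 1), "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"customer_id": 1, "city": "Porto"}], date(2021, 11, 22))
# The dimension now holds two rows for customer 1: the expired Lisbon row
# and a new current Porto row.
```

In the course, the same pattern is built visually with mapping data flows.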

Realize Integrated Analytical Solutions with Azure Synapse Analytics

  • Introduction to Azure Synapse Analytics
  • Survey the Components of Azure Synapse Analytics
  • Explore Azure Synapse Studio
  • Design a Modern Data Warehouse using Azure Synapse Analytics

Work with Data Warehouses using Azure Synapse Analytics

  • Design a Modern Data Warehouse using Azure Synapse Analytics
  • Analyze data in a relational data warehouse
  • Use data loading best practices in Azure Synapse Analytics
  • Optimize data warehouse query performance in Azure Synapse Analytics
  • Integrate SQL and Apache Spark pools in Azure Synapse Analytics
  • Understand data warehouse developer features of Azure Synapse Analytics
  • Manage and monitor data warehouse activities in Azure Synapse Analytics
  • Analyze and optimize data warehouse storage in Azure Synapse Analytics
  • Secure a data warehouse in Azure Synapse Analytics
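
A design point behind several of these modules: a dedicated SQL pool in Azure Synapse Analytics spreads a hash-distributed table across 60 distributions based on a distribution column, so choosing that column well matters for performance. A minimal pure-Python sketch of the idea (Python's `hash` is only an illustrative stand-in for the engine's actual hash function):

```python
DISTRIBUTION_COUNT = 60  # dedicated SQL pools always use 60 distributions

def assign_distribution(distribution_key):
    # Illustrative stand-in for the engine's deterministic hash: the same
    # key always lands in the same distribution, which is why a
    # low-cardinality or skewed column makes a poor distribution key.
    return hash(distribution_key) % DISTRIBUTION_COUNT

rows = [{"order_id": i, "customer_id": i % 5} for i in range(1000)]

# Distributing on a 5-value column uses at most 5 of the 60 distributions,
# while distributing on the high-cardinality order_id spreads the load.
skewed = {assign_distribution(r["customer_id"]) for r in rows}
spread = {assign_distribution(r["order_id"]) for r in rows}
print(len(skewed), len(spread))  # prints: 5 60
```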

Perform data engineering with Azure Synapse Apache Spark Pools

  • Analyze data with Apache Spark in Azure Synapse Analytics
  • Use Delta Lake in Azure Synapse Analytics
  • Integrate SQL and Apache Spark pools in Azure Synapse Analytics
  • Monitor and manage data engineering workloads with Apache Spark in Azure Synapse Analytics

Work with Hybrid Transactional and Analytical Processing Solutions using Azure Synapse Analytics

  • Plan hybrid transactional and analytical processing using Azure Synapse Analytics
  • Implement Azure Synapse Link with Azure Cosmos DB
  • Implement Azure Synapse Link for SQL

Data engineering with Azure Databricks

  • Explore Azure Databricks
  • Use Apache Spark in Azure Databricks
  • Use Delta Lake in Azure Databricks
  • Use SQL Warehouses in Azure Databricks
  • Run Azure Databricks Notebooks with Azure Data Factory

Large-Scale Data Processing with Azure Data Lake Storage Gen2

  • Introduction to Azure Data Lake storage
  • Upload data to Azure Data Lake Storage
  • Secure your Azure Storage account
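
A common convention when uploading to Azure Data Lake Storage Gen2, whose hierarchical namespace makes folder paths first-class, is a date-partitioned directory layout. A minimal pure-Python sketch of such a layout (the zone and dataset names are hypothetical):

```python
from datetime import date

def lake_path(zone, dataset, event_date, filename):
    # Date-partitioned layout, e.g. raw/sales/year=2021/month=11/day=22/...;
    # the year=/month=/day= folders let query engines prune partitions.
    return (f"{zone}/{dataset}/year={event_date.year}"
            f"/month={event_date.month:02d}/day={event_date.day:02d}/{filename}")

p = lake_path("raw", "sales", date(2021, 11, 22), "orders.csv")
print(p)  # prints: raw/sales/year=2021/month=11/day=22/orders.csv
```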

Implement a Data Streaming Solution with Azure Stream Analytics

  • Get started with Azure Stream Analytics
  • Ingest streaming data using Azure Stream Analytics and Azure Synapse Analytics
  • Visualize real-time data with Azure Stream Analytics and Power BI
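
The core idea of this streaming module, windowed aggregation over an unbounded stream, can be sketched in pure Python. This mimics what a tumbling-window Stream Analytics query computes; the event shape and the 30-second window are hypothetical:

```python
from collections import defaultdict

WINDOW_SECONDS = 30

def tumbling_average(events):
    # Group timestamped events into fixed, non-overlapping 30-second
    # windows per device and average each window, as a tumbling-window
    # streaming query would.
    windows = defaultdict(list)
    for ts, device, temp in events:
        window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
        windows[(window_start, device)].append(temp)
    return {key: sum(vals) / len(vals) for key, vals in windows.items()}

events = [(0, "dev1", 20.0), (10, "dev1", 22.0), (35, "dev1", 30.0)]
result = tumbling_average(events)
print(result)  # prints: {(0, 'dev1'): 21.0, (30, 'dev1'): 30.0}
```

In Azure Stream Analytics itself, the equivalent is expressed declaratively with a windowing function in the query language rather than in application code.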

On successfully completing this training course, meeting the minimum attendance requirement of 70% and after the course assessment, trainees may receive their Microsoft certificate of completion and a digital badge to share with their online professional network.

Register


   Your personal data is collected in accordance with the General Data Protection Regulation (GDPR). Do you consent to your data being used, under the terms of our Privacy Policy, for contact regarding and delivery of:

   Information and marketing about products and services, such as campaigns and events?

For more information, see the Grupo Rumos Privacy Policy. You may withdraw your consent at any time via the "Cancelar subscrição" or "Unsubscribe" button included in every communication sent, and you may exercise the rights described in the privacy policy.

DP-203: Data Engineering on Microsoft Azure

  • Dates
    22 Nov to 25 Nov 2021
    Live Training
  • Schedule
    Daytime
    9:00 to 17:00
  • Number of hours
    28
  • Price
    1510€
