
Microsoft DP-3011 - Implement a Data Analytics Solution with Azure Databricks

  • Length: 1 day

Why study this course

This one-day course explores how to use Databricks and Apache Spark on Azure to take data projects from exploration to production. You’ll learn how to ingest, transform, and analyse large-scale datasets with Spark DataFrames, Spark SQL, and PySpark, while also building confidence in managing distributed data processing. Along the way, you’ll get hands-on with the Databricks workspace - navigating clusters and creating and optimising Delta tables. You’ll also dive into data engineering practices, including designing ETL pipelines, handling schema evolution, and enforcing data quality. The course then moves into orchestration, showing you how to automate and manage workloads with Lakeflow Jobs and pipelines. To round things out, you’ll explore governance and security capabilities such as Unity Catalog and Purview integration, ensuring you can work with data in a secure, well-managed, and production-ready environment.
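
To give a flavour of the hands-on work, the short PySpark sketch below walks through the ingest, transform, and analyse flow described above. It is illustrative only: it assumes a Databricks notebook where a SparkSession named spark is already available, and the file path, column names, and table name are hypothetical rather than taken from the course materials.

  # Illustrative sketch only: assumes a Databricks notebook where `spark` is
  # predefined; the path, column names, and table name below are hypothetical.
  from pyspark.sql import functions as F

  # Ingest: read raw CSV data into a Spark DataFrame, inferring the schema.
  raw = (spark.read
              .option("header", True)
              .option("inferSchema", True)
              .csv("/databricks-datasets/example/sales.csv"))  # hypothetical path

  # Transform: filter and aggregate with the DataFrame API.
  daily_totals = (raw
      .filter(F.col("amount") > 0)
      .groupBy("order_date")
      .agg(F.sum("amount").alias("total_amount")))

  # Load: persist the result as a managed Delta table.
  daily_totals.write.format("delta").mode("overwrite").saveAsTable("sales_daily_totals")

  # Analyse: the same table is immediately queryable with Spark SQL.
  spark.sql("SELECT order_date, total_amount FROM sales_daily_totals "
            "ORDER BY total_amount DESC LIMIT 10").show()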



What you’ll learn

After completing this course, students will be able to:

  • Identify core workloads for Azure Databricks

  • Use the data governance tools Unity Catalog and Microsoft Purview

  • Ingest data using Azure Databricks

  • Use the different data exploration tools in Azure Databricks

  • Analyse data with DataFrame APIs

  • Describe key elements of the Apache Spark architecture

  • Create and configure a Spark cluster

  • Describe use cases for Spark

  • Use Spark to process and analyse data stored in files

  • Use Spark to visualise data

  • Create Delta tables

  • Use schema versioning and time travel in Delta Lake (see the sketch after this list)

  • Maintain data integrity with Delta Lake

  • Describe Lakeflow Declarative Pipelines

  • Ingest data into Lakeflow Declarative Pipelines

  • Use Lakeflow Declarative Pipelines for real-time data processing

  • Describe the key components and benefits of Lakeflow Jobs

  • Deploy workloads using Lakeflow Jobs
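
The Delta Lake outcomes above are easiest to picture with a small example. The sketch below is indicative only: it assumes a Databricks notebook where a SparkSession named spark is available, and the table name demo_orders is hypothetical. It shows how every write creates a new table version that can be inspected and queried with time travel.

  # Illustrative sketch only: the table name `demo_orders` is hypothetical.

  # Version 0: create a small Delta table.
  spark.createDataFrame(
      [("2024-01-01", 100.0)], ["order_date", "amount"]
  ).write.format("delta").mode("overwrite").saveAsTable("demo_orders")

  # Version 1: append a row; every write is recorded in the Delta transaction log.
  spark.createDataFrame(
      [("2024-01-02", 250.0)], ["order_date", "amount"]
  ).write.format("delta").mode("append").saveAsTable("demo_orders")

  # Inspect the table's version history.
  spark.sql("DESCRIBE HISTORY demo_orders").show(truncate=False)

  # Time travel: query the table as it looked at version 0 (a single row).
  spark.sql("SELECT * FROM demo_orders VERSION AS OF 0").show()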



Microsoft Azure at Lumify Work

As part of Lumify Group, Lumify Work has skilled more people in Microsoft technologies than any other organisation in Australia and New Zealand, and we also have a campus in the Philippines. We offer the broadest range of instructor-led training courses, from end user to architect level. We are proud to be the winner of the Microsoft MCT Superstars Award for FY24, which formally recognises us as having the highest quality Microsoft Certified Trainers in ANZ.


Who is the course for?

This course is designed for data professionals who want to strengthen their skills in building and managing data solutions on Azure Databricks. It’s a good fit if you’re a data engineer, data analyst, or developer with some prior experience in Python, SQL, and basic cloud concepts, and you’re looking to move beyond small-scale analysis into scalable, production-ready data processing. Whether your goal is to modernise analytics workflows, optimise pipelines, or better manage and govern data at scale, this course will equip you with the practical skills to succeed.


Course subjects

  • Explore Azure Databricks

  • Perform data analysis with Azure Databricks

  • Use Apache Spark in Azure Databricks

  • Manage data with Delta Lake

  • Build Lakeflow Declarative Pipelines

  • Deploy workloads with Azure Databricks Workflows


Prerequisites

You should already be comfortable with the fundamentals of Python and SQL. This includes being able to write simple Python scripts and work with common data structures, as well as writing SQL queries to filter, join, and aggregate data. A basic understanding of common file formats such as CSV, JSON, or Parquet will also help when working with datasets.
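
As a rough guide to that level, the short sketch below shows the kind of Python and SQL you should be comfortable reading before attending. The table and column names are hypothetical and are not drawn from the course materials.

  # Python: basic data structures, a comprehension, and string formatting.
  orders = [
      {"id": 1, "customer": "Avery", "amount": 120.0},
      {"id": 2, "customer": "Blake", "amount": 75.5},
  ]
  total = sum(order["amount"] for order in orders)
  print(f"Total order value: {total}")

  # SQL: filter, join, and aggregate (held as a string here for reference only).
  query = """
      SELECT c.region, SUM(o.amount) AS total_amount
      FROM orders AS o
      JOIN customers AS c ON o.customer_id = c.id
      WHERE o.amount > 50
      GROUP BY c.region
  """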

In addition, familiarity with the Azure portal and core services like Azure Storage is important, along with a general awareness of data concepts such as batch versus streaming processing and structured versus unstructured data. While not mandatory, prior exposure to big data frameworks like Spark, and experience working with Jupyter notebooks, can make the transition to Databricks smoother.




Terms & Conditions

The supply of this course by Lumify Work is governed by the booking terms and conditions. Please read them carefully before enrolling, as enrolment is conditional on acceptance of these terms and conditions.



