
DP-900 Study Guide 2026: Complete Exam Breakdown & Pass Strategy

Everything you need to pass the Microsoft Azure Data Fundamentals exam — all four domains explained, a 3-week study plan, and what actually appears on exam day.

By MSCertQuiz Team · Updated March 2026 · 18 min read

Quick Summary

  • DP-900 is a Fundamentals-level exam: 40–60 questions, 60 minutes, 700/1000 passing score
  • Covers 4 domains: core data concepts, relational data, non-relational data, analytics workloads
  • Most candidates pass with 2–4 weeks of preparation
  • Exam cost: $99 USD
  • No prerequisites — good for beginners entering the data/cloud space

What is the DP-900 Exam?

The DP-900 Microsoft Azure Data Fundamentals exam validates your foundational knowledge of core data concepts and how they are implemented using Azure data services. It is a beginner-friendly certification in Microsoft's data certification path.

Passing DP-900 earns you the Microsoft Certified: Azure Data Fundamentals credential. Unlike role-based certifications like AZ-104, this is a knowledge-level exam — it tests your understanding of what Azure data services do rather than how to configure them in detail.

DP-900 is ideal for:

  • Business analysts and data professionals new to Azure
  • IT professionals who want to understand Azure's data services
  • Students entering data engineering or data science careers
  • Anyone preparing for DP-300 (Azure Database Administrator) or DP-203 (Azure Data Engineer)
Exam Code: DP-900
Credential: Microsoft Certified: Azure Data Fundamentals
Questions: 40–60
Time Limit: 60 minutes
Passing Score: 700 out of 1000
Price: $99 USD
Level: Fundamentals (Beginner)
Prerequisites: None
Renewal: None — Fundamentals certifications do not expire

DP-900 Exam Domains & Weightings

The DP-900 exam has four domains. Here's what each covers and how much weight it carries:

Domain 1: Describe core data concepts

Weight: 25–30%
  • Identify data formats (structured, semi-structured, unstructured)
  • Describe data storage options (file, database, object)
  • Describe common data processing concepts (batch vs stream)
  • Identify roles in data workloads (data engineer, data analyst, data scientist, DBA)
  • Describe OLTP vs OLAP workloads
  • Data file formats: CSV, JSON, Parquet, Avro, ORC
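The file-format bullet above is worth a concrete look. Here is a minimal Python stdlib sketch contrasting row-based, fully structured CSV with semi-structured JSON; the sample records are made up for illustration, and Parquet/Avro/ORC are binary formats that need third-party libraries, so they appear only in comments:

```python
import csv
import io
import json

# Hypothetical sample records for illustration.
rows = [
    {"id": 1, "name": "Ada", "role": "data engineer"},
    {"id": 2, "name": "Grace", "role": "data analyst"},
]

# CSV: row-based and structured -- every record must share the same columns.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "name", "role"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# JSON: semi-structured -- records can nest and vary in shape per document.
json_text = json.dumps(rows)

# Parquet and ORC are columnar binary formats (efficient for analytics scans
# over a few columns); Avro is a row-based binary format (efficient for
# write-heavy pipelines). All three need libraries such as pyarrow -- not shown.

print(csv_text.splitlines()[0])          # the CSV header row
print(json.loads(json_text)[1]["name"])  # round-trip through JSON
```

The exam mainly wants the row-vs-columnar distinction: CSV/Avro store whole records together, while Parquet/ORC store columns together, which is why analytics workloads favor the latter.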

Domain 2: Identify considerations for relational data on Azure

Weight: 20–25%
  • Relational data concepts (tables, keys, normalization, relationships)
  • Identify Azure relational database services: Azure SQL Database, Azure SQL Managed Instance, Azure SQL on VM, Azure Database for PostgreSQL, Azure Database for MySQL
  • SQL fundamentals: SELECT, INSERT, UPDATE, DELETE, DDL vs DML
  • Database objects: views, stored procedures, indexes
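The SQL fundamentals and database-object bullets above can be seen end to end in a few lines. This sketch uses Python's built-in sqlite3 rather than Azure SQL (the table and data are invented for the example), but the concepts being tested — DDL vs DML, primary keys, views — transfer directly:

```python
import sqlite3

# In-memory SQLite database; the DDL/DML split and objects like views
# work the same way conceptually in Azure SQL Database.
conn = sqlite3.connect(":memory:")

# DDL (Data Definition Language): defines schema -- CREATE, ALTER, DROP.
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,   -- primary key uniquely identifies each row
        customer TEXT NOT NULL,
        total REAL
    )
""")
# A view is a stored, named query -- a database object, not a copy of the data.
conn.execute("CREATE VIEW big_orders AS SELECT * FROM orders WHERE total > 100")

# DML (Data Manipulation Language): works with the data -- INSERT, UPDATE,
# DELETE, SELECT.
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("Ada", 150.0), ("Grace", 80.0)],
)
conn.execute("UPDATE orders SET total = 90.0 WHERE customer = 'Grace'")

names = [row[0] for row in conn.execute("SELECT customer FROM big_orders")]
print(names)  # ['Ada'] -- only Ada's order exceeds 100
```

For the exam, the key distinction is that CREATE/ALTER/DROP change structure (DDL) while SELECT/INSERT/UPDATE/DELETE change or read data (DML).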

Domain 3: Describe considerations for working with non-relational data on Azure

Weight: 15–20%
  • Non-relational data concepts (key-value, document, column-family, graph)
  • Azure Cosmos DB: APIs (SQL/Core, MongoDB, Cassandra, Gremlin, Table)
  • Azure Blob Storage: containers, access tiers (Hot, Cool, Cold, Archive)
  • Azure Table Storage: entities, partition key, row key
  • Azure File Storage: file shares, SMB protocol
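The non-relational models listed above are easiest to tell apart with tiny mock structures. The following sketch models three of them with plain Python dicts — the entity names and values are invented, and the comments map each shape to the Azure service it resembles:

```python
# Key-value (Azure Table Storage / Cosmos DB Table API): each entity is
# addressed by the composite (PartitionKey, RowKey) pair.
table_store = {
    ("sensors", "device-001"): {"temp": 21.5},
    ("sensors", "device-002"): {"temp": 19.0},
}

# Document (Cosmos DB for NoSQL / MongoDB API): each record is a
# self-contained document, and the schema may vary document to document.
documents = [
    {"id": "1", "name": "Ada", "tags": ["engineer"]},
    {"id": "2", "name": "Grace", "email": "grace@example.com"},  # different shape
]

# Graph (Cosmos DB Gremlin API): vertices plus edges describing relationships.
vertices = {"ada", "grace"}
edges = [("ada", "knows", "grace")]

# Point lookup by partition key + row key -- the access pattern Table
# Storage is optimized for.
print(table_store[("sensors", "device-001")]["temp"])
```

Exam scenarios usually hinge on the access pattern: key lookups suggest key-value, varying record shapes suggest document, and relationship traversal suggests graph.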

Domain 4: Describe an analytics workload on Azure

Weight: 25–30%
  • Large-scale analytics concepts: data warehouse, data lake, data lakehouse
  • Azure Synapse Analytics: SQL pools, Spark pools, pipelines
  • Azure Data Factory: data integration, pipelines, datasets
  • Azure Databricks: big data processing, Apache Spark
  • Microsoft Fabric: unified analytics platform
  • Power BI: datasets, reports, dashboards, workspaces

3-Week DP-900 Study Plan

This plan assumes 1–1.5 hours of study per day. DP-900 is a fundamentals exam — the goal is breadth of understanding, not deep technical depth.

Week 1: Data Concepts + Relational Data
Day 1–2: Core data concepts: structured vs semi-structured vs unstructured data, batch vs stream processing, data roles (engineer, analyst, scientist, DBA)
Day 3: Data file formats: CSV, JSON, Parquet, Avro. When to use each and why. Columnar vs row-based storage.
Day 4–5: Relational data: tables, primary/foreign keys, normalization, SQL basics (SELECT, WHERE, JOIN, GROUP BY)
Day 6: Azure relational services: Azure SQL Database vs SQL Managed Instance vs SQL on VM, PostgreSQL and MySQL on Azure
Day 7: Practice questions: 20 questions on Domains 1 and 2. Review all incorrect answers.
Week 2: Non-Relational Data
Day 8–9: Non-relational data types: key-value, document, column-family, graph. When to use each type. NoSQL vs SQL tradeoffs.
Day 10–11: Azure Cosmos DB deep dive: global distribution, consistency levels, APIs (SQL, MongoDB, Cassandra, Gremlin, Table), partitioning
Day 12: Azure Storage services: Blob Storage (access tiers), Table Storage (partition key/row key), File Storage, Queue Storage
Day 13: Practice questions: 20 questions on Domain 3. Focus on Cosmos DB API selection scenarios.
Day 14: Review and catch-up day. Re-read any sections from weeks 1–2 where you felt uncertain.
Week 3: Analytics + Mock Exams
Day 15–16: Analytics concepts: data warehouse vs data lake vs data lakehouse. ELT vs ETL. Azure Synapse Analytics overview (SQL pools, Spark pools, serverless)
Day 17: Azure Data Factory, Azure Databricks, Microsoft Fabric. What each does and when to use each vs Synapse.
Day 18: Power BI: datasets, reports, dashboards, dataflows, Power BI Service vs Desktop. Report vs Dashboard distinction.
Day 19: Full mock exam (40 questions, 60-minute timer). Score and review results.
Day 20: Targeted review of weak areas from mock exam.
Day 21: Second mock exam. Aim for 80%+ before booking. Light review only on the evening of Day 21.

The Most Tested DP-900 Topics

When to use Cosmos DB vs Azure SQL Database

The exam loves "which service should you use for X requirement" questions. Know the split: Cosmos DB for globally distributed, high-scale, variable-schema scenarios; Azure SQL Database for ACID transactions, complex relational queries, and traditional structured data.

Azure Synapse vs Azure Data Factory vs Azure Databricks

Three analytics services with overlapping capabilities. Synapse is an integrated analytics workspace. ADF is a pure data integration/orchestration service. Databricks is Apache Spark-based big data processing. The exam tests which is appropriate for a given workload.

Cosmos DB API selection

Five APIs serve different use cases: Core (SQL) for document data with SQL-like queries, MongoDB for existing MongoDB apps, Cassandra for column-family data, Gremlin for graph data, Table for key-value (Azure Table Storage replacement).

Batch vs Stream processing

Batch: process large data sets on a schedule (e.g., nightly reports). Stream: process data as it arrives, in real time (e.g., IoT sensor alerts). Use Azure Stream Analytics for streaming, and Azure Synapse or Azure Data Factory for batch.
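The batch/stream distinction can be sketched in a few lines of Python. This is purely illustrative (the readings and threshold are invented, and real Azure Stream Analytics jobs are written in a SQL-like query language), but it captures the processing models the exam asks about:

```python
# Hypothetical IoT temperature readings for illustration.
readings = [3, 48, 5, 51, 2]

# Batch: accumulate the full data set, then process it on a schedule --
# the Synapse / Data Factory pattern (e.g., a nightly summary report).
def batch_report(data):
    return {"count": len(data), "max": max(data)}

# Stream: handle each event as it arrives and react immediately --
# the Azure Stream Analytics pattern (e.g., threshold alerts).
def stream_alerts(events, threshold=40):
    alerts = []
    for value in events:           # in production this would be an unbounded feed
        if value > threshold:
            alerts.append(value)   # fire the alert as the event arrives
    return alerts

print(batch_report(readings))   # summary over the whole set
print(stream_alerts(readings))  # alerts emitted per event
```

The tell in exam scenarios is latency: "every night" or "end of month" points to batch; "as soon as", "immediately", or "real time" points to streaming.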

Power BI components

Dataset (data model), Report (visualizations on a single dataset), Dashboard (pinned visuals from multiple reports), Dataflow (cloud-based ETL). Power BI Desktop (authoring) vs Power BI Service (sharing/viewing).

Ready to Practice DP-900?

500 practice questions across all 4 domains. Start with 40 questions free.

Start Free Practice →