DP-900 Study Guide 2026: Complete Exam Breakdown & Pass Strategy
Everything you need to pass the Microsoft Azure Data Fundamentals exam — all four domains explained, a 3-week study plan, and what actually appears on exam day.
Quick Summary
- DP-900 is a Fundamentals-level exam: 40–60 questions, 60 minutes, 700/1000 passing score
- Covers 4 domains: core data concepts, relational data, non-relational data, analytics workloads
- Most candidates pass with 2–4 weeks of preparation
- Exam cost: $165 USD
- No prerequisites — good for beginners entering the data/cloud space
What is the DP-900 Exam?
The DP-900 Microsoft Azure Data Fundamentals exam validates your foundational knowledge of core data concepts and how they are implemented using Azure data services. It is a beginner-friendly certification in Microsoft's data certification path.
Passing DP-900 earns you the Microsoft Certified: Azure Data Fundamentals credential. Unlike role-based certifications like AZ-104, this is a knowledge-level exam — it tests your understanding of what Azure data services do rather than how to configure them in detail.
DP-900 is ideal for:
- Business analysts and data professionals new to Azure
- IT professionals who want to understand Azure's data services
- Students entering data engineering or data science careers
- Anyone preparing for DP-300 (Azure Database Administrator) or DP-203 (Azure Data Engineer)
| Detail | Information |
|---|---|
| Exam Code | DP-900 |
| Credential | Azure Data Fundamentals |
| Questions | 40–60 |
| Time Limit | 60 minutes |
| Passing Score | 700 out of 1000 |
| Price | $165 USD |
| Level | Fundamentals (Beginner) |
| Prerequisites | None |
| Renewal | Annual free online assessment |
DP-900 Exam Domains & Weightings
The DP-900 exam has four domains. Here's what each covers and how much weight it carries:
Domain 1: Describe core data concepts
Weight: 25%

- Identify data formats (structured, semi-structured, unstructured)
- Describe data storage options (file, database, object)
- Describe common data processing concepts (batch vs stream)
- Identify roles in data workloads (data engineer, data analyst, data scientist, DBA)
- Describe OLTP vs OLAP workloads
- Data file formats: CSV, JSON, Parquet, Avro, ORC
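To make the structured vs semi-structured distinction concrete, here is a minimal sketch using Python's standard `csv` and `json` modules (the data values are hypothetical). CSV forces every row into the same fixed columns, while JSON documents can each carry different fields:

```python
import csv
import io
import json

# Structured: CSV — every row must fit the same fixed set of columns.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["id", "name", "city"])
writer.writerow([1, "Ada", "London"])
rows = list(csv.reader(io.StringIO(buffer.getvalue())))

# Semi-structured: JSON — individual documents may add or omit fields.
docs = [
    {"id": 1, "name": "Ada"},
    {"id": 2, "name": "Grace", "tags": ["navy", "compiler"]},  # extra field is fine
]
decoded = json.loads(json.dumps(docs))

print(rows[0])              # the CSV header row
print(decoded[1]["tags"])   # a field only one document has
```

Parquet, Avro, and ORC follow the same split at a larger scale: they are binary, schema-aware formats optimized for analytics, whereas CSV and JSON are human-readable text.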
Domain 2: Identify considerations for relational data on Azure
Weight: 25%

- Relational data concepts (tables, keys, normalization, relationships)
- Identify Azure relational database services: Azure SQL Database, Azure SQL Managed Instance, SQL Server on Azure Virtual Machines, Azure Database for PostgreSQL, MySQL, MariaDB
- SQL fundamentals: SELECT, INSERT, UPDATE, DELETE, DDL vs DML
- Database objects: views, stored procedures, indexes
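The DDL vs DML distinction is easy to see in running code. This sketch uses Python's built-in `sqlite3` as a stand-in for an Azure SQL database (the table and values are hypothetical), but the statements themselves are standard SQL:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL (Data Definition Language): defines structure — CREATE, ALTER, DROP.
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL)")

# DML (Data Manipulation Language): works with the data — INSERT, UPDATE, DELETE, SELECT.
conn.execute("INSERT INTO products (name, price) VALUES ('Keyboard', 49.99)")
conn.execute("UPDATE products SET price = 39.99 WHERE name = 'Keyboard'")
price = conn.execute(
    "SELECT price FROM products WHERE name = 'Keyboard'"
).fetchone()[0]

print(price)  # 39.99
```

Exam questions often ask you to classify a given statement: if it changes the schema it's DDL, if it changes or reads rows it's DML.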
Domain 3: Describe considerations for working with non-relational data on Azure
Weight: 25%

- Non-relational data concepts (key-value, document, column-family, graph)
- Azure Cosmos DB: APIs (SQL/Core, MongoDB, Cassandra, Gremlin, Table)
- Azure Blob Storage: containers, access tiers (Hot, Cool, Cold, Archive)
- Azure Table Storage: entities, partition key, row key
- Azure Files: file shares, SMB protocol
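The partition key / row key pair in Table Storage is worth internalizing: together they uniquely address one entity, and lookups by both keys are fast point reads. This hypothetical sketch models that addressing scheme with a plain Python dict (the keys and values are invented for illustration):

```python
# Hypothetical model: a Table Storage table addressed by (PartitionKey, RowKey).
# PartitionKey groups related entities; RowKey must be unique within the partition.
table = {
    ("sensors-eu", "device-001"): {"temp": 21.5},
    ("sensors-eu", "device-002"): {"temp": 22.0},
    ("sensors-us", "device-001"): {"temp": 19.8},  # same RowKey, different partition
}

# A point read supplies both keys — the fastest query pattern in Table Storage.
entity = table[("sensors-eu", "device-002")]
print(entity["temp"])  # 22.0
```

Queries that filter only on the partition key scan one partition; queries on neither key scan the whole table, which is why choosing good keys matters.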
Domain 4: Describe an analytics workload on Azure
Weight: 25%

- Large-scale analytics concepts: data warehouse, data lake, data lakehouse
- Azure Synapse Analytics: SQL pools, Spark pools, pipelines
- Azure Data Factory: data integration, pipelines, datasets
- Azure Databricks: big data processing, Apache Spark
- Microsoft Fabric: unified analytics platform
- Power BI: datasets, reports, dashboards, workspaces
3-Week DP-900 Study Plan
This plan assumes 1–1.5 hours of study per day. DP-900 is a fundamentals exam — the goal is breadth of understanding, not deep technical depth.
The Most Tested DP-900 Topics
When to use Cosmos DB vs Azure SQL Database
The exam loves "which service should you use for X requirement" questions. Know the split: Cosmos DB for globally distributed, high-scale, variable-schema scenarios; Azure SQL Database for ACID transactions, complex relational queries, and traditional structured data.
Azure Synapse vs Azure Data Factory vs Azure Databricks
Three analytics services with overlapping capabilities. Synapse is an integrated analytics workspace. ADF is a pure data integration/orchestration service. Databricks is Apache Spark-based big data processing. The exam tests which is appropriate for a given workload.
Cosmos DB API selection
Five APIs serve different use cases: Core (SQL) for document data with SQL-like queries, MongoDB for existing MongoDB apps, Cassandra for column-family data, Gremlin for graph data, Table for key-value (Azure Table Storage replacement).
Batch vs Stream processing
Batch: process large data sets on a schedule (e.g., nightly reports). Stream: process data as it arrives in real time (e.g., IoT sensor alerts). Use Azure Stream Analytics for streaming and Azure Synapse or Azure Data Factory for batch.
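The two processing styles above can be sketched in a few lines of Python. This is a toy illustration with invented sensor readings, not any Azure API: batch collects everything and computes once, while stream reacts to each event as it arrives:

```python
from statistics import mean

readings = [21.5, 22.0, 35.0, 21.5]  # hypothetical temperature readings

# Batch: wait until all data is collected, then process it in one pass
# (think: a nightly aggregation job).
def batch_average(data):
    return mean(data)

# Stream: handle each event the moment it arrives
# (think: an IoT alert rule firing on a threshold).
def stream_alerts(data, threshold=30.0):
    alerts = []
    for value in data:  # in a real pipeline this would be an unbounded feed
        if value > threshold:
            alerts.append(value)  # alert fires immediately, not at end of day
    return alerts

print(batch_average(readings))   # 25.0
print(stream_alerts(readings))   # [35.0]
```

The exam framing is latency vs throughput: batch tolerates delay to process large volumes efficiently; streaming trades volume per operation for immediate results.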
Power BI components
Dataset (data model), Report (visualizations on a single dataset), Dashboard (pinned visuals from multiple reports), Dataflow (cloud-based ETL). Power BI Desktop (authoring) vs Power BI Service (sharing/viewing).
Ready to Practice DP-900?
500 practice questions across all 4 domains. Start with 40 questions free.
Start Free Practice →