Exam DP-200: Implementing an Azure Data Solution

Candidates for this exam are Microsoft Azure data engineers who collaborate with business stakeholders to identify and meet the data requirements to implement data solutions that use Azure data services.

Azure data engineers are responsible for data-related tasks that include provisioning data storage services, ingesting streaming and batch data, transforming data, implementing security requirements, implementing data retention policies, identifying performance bottlenecks, and accessing external data sources.

Candidates for this exam must be able to implement data solutions that use the following Azure services: Azure Cosmos DB, Azure SQL Database, Azure SQL Data Warehouse, Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.


Part of the requirements for: Microsoft Certified: Azure Data Engineer Associate

Related exam: Exam DP-201


Languages: English

This exam measures your ability to accomplish the following technical tasks: implement data storage solutions; manage and develop data processing; and monitor and optimize data solutions.


Skills measured

Exam DP-200 Update Summary

This exam has been refocused to differentiate the data engineer from a database administrator, emphasizing the data engineer’s proficiency in implementing solutions for data storage in varying forms and structures, and for varying purposes.

Updated content areas (effective June 21, 2019)

  • Reorganized implementation of data storage solutions to two key categories: relational and non-relational data stores
  • Removed focus on HDInsight
  • Merged monitoring and optimization
  • Reduced focus on Hybrid data scenarios
  • Removed Azure DevOps pipelines
  • Referenced authentication and authorization as applicable
  • Referenced data policies and standards as applicable
  • Referenced notifications as applicable

For more details, download the DP-200 change document (Adobe Acrobat PDF).

Implement data storage solutions (40-45%)

Implement non-relational data stores

  • implement a solution that uses Cosmos DB, Data Lake Storage Gen2, or Blob storage
  • implement data distribution and partitions
  • implement a consistency model in Cosmos DB
  • provision a non-relational data store
  • provide access to data to meet security requirements
  • implement for high availability, disaster recovery, and global distribution
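The "data distribution and partitions" skill above can be pictured with a small sketch: hash a partition-key value to choose a logical partition, which is the general idea behind Cosmos DB's hash partitioning. The hash function and partition count below are illustrative assumptions, not the service's internal algorithm.

```python
import hashlib

def partition_for(key: str, partition_count: int) -> int:
    """Map a partition-key value to a partition by hashing (illustrative only;
    Cosmos DB uses its own internal hash over the configured partition key)."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % partition_count

# Items with the same partition-key value always land in the same logical partition.
docs = [{"id": "1", "city": "Seattle"},
        {"id": "2", "city": "London"},
        {"id": "3", "city": "Seattle"}]
placement = {d["id"]: partition_for(d["city"], partition_count=4) for d in docs}
```

Choosing a partition key with many distinct values and an even access pattern is what keeps throughput spread across partitions, which is the point this skill area tests.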

Implement relational data stores

  • configure elastic pools
  • configure geo-replication
  • provide access to data to meet security requirements
  • implement for high availability, disaster recovery, and global distribution
  • implement data distribution and partitions for SQL Data Warehouse
  • implement PolyBase
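For "data distribution and partitions for SQL Data Warehouse", the recurring decision is which distribution method a table gets (hash, round-robin, or replicate). The heuristic below is a hedged sketch; the thresholds are assumptions for illustration, and the real choice depends on query and join patterns. The figure of 60 distributions is the fixed distribution count in SQL Data Warehouse.

```python
def pick_distribution(row_count: int, is_fact_table: bool,
                      join_column_cardinality: int) -> str:
    """Heuristic sketch for choosing a SQL Data Warehouse table distribution.
    Thresholds here are illustrative assumptions, not Microsoft guidance."""
    if not is_fact_table and row_count < 2_000_000:
        return "REPLICATE"      # small dimension tables can be copied to every node
    if join_column_cardinality >= 60:
        return "HASH"           # enough distinct keys to spread across 60 distributions
    return "ROUND_ROBIN"        # safe default when no good hash key exists
```

A skewed hash key concentrates data on a few distributions and becomes exactly the kind of partitioning bottleneck the optimization section of this exam covers.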

Manage data security

  • implement data masking
  • encrypt data at rest and in motion
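The "implement data masking" skill refers to dynamic data masking in Azure SQL Database, which rewrites sensitive column values for non-privileged readers. The functions below roughly mirror the built-in email and credit-card masks as a pure-Python sketch; the exact masked formats are assumptions based on the documented defaults, not the engine itself.

```python
def mask_email(value: str) -> str:
    """Roughly mimic the built-in email mask: keep the first character and the
    top-level domain, mask the rest (e.g. aXXX@XXXX.com). Illustrative only."""
    local, _, domain = value.partition("@")
    tld = domain.rsplit(".", 1)[-1] if "." in domain else "com"
    return f"{local[:1]}XXX@XXXX.{tld}"

def mask_credit_card(value: str) -> str:
    """Expose only the last four digits, as the built-in credit-card mask does."""
    digits = [c for c in value if c.isdigit()]
    return "xxxx-xxxx-xxxx-" + "".join(digits[-4:])
```

In the real service the mask is declared per column (via the portal, T-SQL, or ARM templates) and applied at query time; privileged logins still see the raw values.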

Develop and manage data processing (25-30%)

Develop batch processing solutions

  • develop batch processing solutions by using Data Factory and Azure Databricks
  • ingest data by using PolyBase
  • implement the integration runtime for Data Factory
  • create linked services and datasets
  • create pipelines and activities
  • create and schedule triggers
  • implement Azure Databricks clusters, notebooks, jobs, and autoscaling
  • ingest data into Azure Databricks
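The pipeline-and-activity skills above rest on one core idea: a Data Factory pipeline is an ordered collection of activities, where each activity performs one step (copy, transform, and so on). The sketch below models that concept in plain Python; the class names and payload shape are illustrative assumptions, not the Data Factory SDK.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Activity:
    # One step in a pipeline, e.g. a copy or transformation activity.
    name: str
    run: Callable[[Dict], Dict]

@dataclass
class Pipeline:
    name: str
    activities: List[Activity] = field(default_factory=list)

    def execute(self, payload: Dict) -> Dict:
        # Activities run in sequence; each receives the previous output,
        # loosely mirroring a dependency-ordered pipeline run.
        for activity in self.activities:
            payload = activity.run(payload)
        return payload

ingest = Activity("ingest", lambda p: {**p, "rows": [1, 2, 3]})
transform = Activity("transform", lambda p: {**p, "rows": [r * 10 for r in p["rows"]]})
pipeline = Pipeline("demo", [ingest, transform])
result = pipeline.execute({})
```

In the real service, linked services hold connection information, datasets describe the data an activity reads or writes, and triggers (schedule, tumbling-window, or event-based) start pipeline runs.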

Develop streaming solutions

  • configure input and output
  • select the appropriate windowing functions
  • implement event processing using Stream Analytics
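"Select the appropriate windowing functions" refers to Stream Analytics window types (tumbling, hopping, sliding, session). A tumbling window, the simplest, slices the stream into fixed, non-overlapping intervals. The sketch below groups timestamped events the way a `TumblingWindow(second, n)` would; it is illustrative Python, not the Stream Analytics engine.

```python
from collections import defaultdict

def tumbling_window(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping windows
    and sum the values in each window (illustrative sketch)."""
    windows = defaultdict(list)
    for ts, value in events:
        window_start = (ts // window_seconds) * window_seconds
        windows[window_start].append(value)
    return {start: sum(vals) for start, vals in sorted(windows.items())}

events = [(0, 1), (3, 2), (5, 4), (9, 1), (10, 7)]
totals = tumbling_window(events, window_seconds=5)
```

The key property, and the one the exam distinguishes between window types on, is that every event belongs to exactly one tumbling window, whereas hopping and sliding windows can count the same event more than once.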

Monitor and optimize data solutions (30-35%)

Monitor data storage

  • monitor relational and non-relational data sources
  • implement Blob storage monitoring
  • implement Data Lake Store monitoring
  • implement SQL Database monitoring
  • implement SQL Data Warehouse monitoring
  • implement Cosmos DB monitoring
  • configure Azure Monitor alerts
  • implement auditing by using Azure Log Analytics
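"Configure Azure Monitor alerts" boils down to a rule: fire when a metric crosses a threshold over an evaluation window. The sketch below captures that evaluation in plain Python; the function name, sample values, and `min_violations` parameter are illustrative assumptions, not Azure Monitor's API.

```python
def evaluate_alert(metric_values, threshold, min_violations=1):
    """Return True when the metric breaches the threshold often enough,
    loosely mimicking a metric alert rule with a 'greater than' operator."""
    violations = sum(1 for v in metric_values if v > threshold)
    return violations >= min_violations

# e.g. DTU-percentage samples collected while monitoring SQL Database
samples = [45.0, 62.5, 91.0, 95.5, 88.0]
fire = evaluate_alert(samples, threshold=90.0, min_violations=2)
```

In Azure Monitor the matching rule would name a resource, a metric, an aggregation (average, maximum, and so on), and an action group to notify when the condition holds.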

Monitor data processing

  • design and implement Data Factory monitoring
  • monitor Azure Databricks
  • monitor HDInsight processing
  • monitor Stream Analytics

Optimize Azure data solutions

  • troubleshoot data partitioning bottlenecks
  • optimize Data Lake Storage
  • optimize Stream Analytics
  • optimize SQL Data Warehouse
  • optimize SQL Database
  • manage data life cycle
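"Manage data life cycle" maps to Blob storage lifecycle management: rules that move blobs between access tiers as they age. The sketch below expresses one such rule as a function; the day thresholds are assumptions for illustration, since real rules are defined as policy JSON on the storage account.

```python
from datetime import datetime, timedelta, timezone

def choose_tier(last_modified, now, cool_after_days=30, archive_after_days=180):
    """Pick a Blob storage access tier by age, the way a lifecycle management
    rule moves blobs Hot -> Cool -> Archive (thresholds are assumptions)."""
    age = now - last_modified
    if age >= timedelta(days=archive_after_days):
        return "Archive"
    if age >= timedelta(days=cool_after_days):
        return "Cool"
    return "Hot"

now = datetime(2019, 6, 21, tzinfo=timezone.utc)
tier = choose_tier(last_modified=now - timedelta(days=45), now=now)
```

Tiering older data to Cool or Archive trades retrieval latency and cost for much cheaper storage, which is the optimization trade-off this skill area examines.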

Prepare for exam



Guide to training

All self-paced and instructor-led courses in one comprehensive guide.


Related certifications

Microsoft Certified: Azure Data Engineer Associate

Azure Data Engineers design and implement the management, monitoring, security, and privacy of data using the full stack of Azure data services to satisfy business needs.

* Pricing does not reflect any promotional offers or reduced pricing for Microsoft Imagine Academy program members, Microsoft Certified Trainers, and Microsoft Partner Network program members. Pricing is subject to change without notice. Pricing does not include applicable taxes. Please confirm exact pricing with the exam provider before registering to take an exam.

Additional resources

Guides to Training and Certifications

Explore all certifications in a concise training and certifications guide or the Training and Certifications poster.

Training guide

Discover training resources to become a Microsoft Certified: Azure Data Engineer Associate.

Exam Replay

See two great offers to help boost your odds of success.

Support for certification exams

Get help through Microsoft Certification support forums. A forum moderator will respond in one business day.