Preparing For The Salesforce ‘Data Architecture & Management Designer’ Exam

It had been almost two months since my last Salesforce certification exam, so I was getting a tad fidgety and decided to go for the Salesforce ‘Data Architecture & Management Designer’ exam. This certification is part of the Architect credential path, along with a few others like ‘Development Lifecycle and Deployment’ and ‘Integration Architecture’. A few folks around me who had passed this exam claimed that a good understanding of the data and security/sharing models in an LDV (large data volume) environment, coupled with awareness of best practices around LDV migration, would significantly help when preparing. They couldn’t have been more correct: I paid heed to their advice and passed the exam.

The exam has 60 multiple-choice questions, a passing score of 70%, and a 90-minute time limit. Here is the credential overview, which is also available on the official Salesforce certification website:


The Salesforce Certified Data Architecture and Management credential is designed for those who assess the architecture environment and requirements and design sound, scalable, and high-performing solutions on the platform as it pertains to enterprise data management.

Here are some examples of the concepts you should understand to pass the exam:

  • Aware of platform-specific design patterns and key limits
  • Understand large data volume considerations, risks, and mitigation strategies
  • Understand LDV considerations with communities
  • Ability to design a data and sharing model that supports an LDV environment
  • Understand data movement best practices in an LDV environment
  • Understand strategies to build an optimized and high-performing solution


As always, I will list the most vital topics that you need to focus on for the exam:

  • Timeout Issues during data loads and how to avoid them
  • Types of APIs and which is best suited for LDV
  • PK Chunking
  • Ways to optimize performance of a data migration
  • The role of Validation Rules, Apex Triggers, and Workflow Rules in promoting good data quality
  • Duplicate Rules and Matching Rules
  • Field history tracking
  • Query Optimization and indexing
  • Auditing Metadata changes in Salesforce
  • Relationship types in Salesforce
  • Third-party ETL tools and their role in data manipulation
  • Master Data Management
  • Role of Reports and Dashboards in promoting/monitoring data quality
  • Best practices around Data Archival in Salesforce
  • Using Standard vs Custom objects in Salesforce

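Of the topics above, PK chunking is worth a closer look: it is enabled with a single request header on a Bulk API job, telling Salesforce to split a large query into chunks keyed on the record Id. Here is a minimal sketch in Python; the helper function, session Id, and chunk size are illustrative assumptions, while the `Sforce-Enable-PKChunking` and `X-SFDC-Session` headers are the documented Bulk API mechanism:

```python
def pk_chunking_headers(session_id, chunk_size=100_000):
    """Build HTTP headers for a Bulk API job request with PK chunking enabled.

    Hypothetical helper for illustration; the header names themselves come
    from the Bulk API documentation.
    """
    return {
        # Bulk API authentication header (legacy XML-based Bulk API):
        "X-SFDC-Session": session_id,
        "Content-Type": "application/xml; charset=UTF-8",
        # Tells Salesforce to split the query into chunks of `chunk_size`
        # records, partitioned on the object's primary key (record Id):
        "Sforce-Enable-PKChunking": f"chunkSize={chunk_size}",
    }

headers = pk_chunking_headers("00Dxx0000001gPL", chunk_size=250_000)
print(headers["Sforce-Enable-PKChunking"])  # chunkSize=250000
```

Because each chunk is bounded by Id ranges rather than by an `OFFSET`, extracts stay fast even on objects with hundreds of millions of rows, which is exactly the scenario the exam's LDV questions probe.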

If you have a firm grip on all of the above topics, I am sure you will be able to tackle 70–80% of the total exam content, and as you must have noticed, that is enough to pass. Also, don’t forget to thoroughly read the following documents:

Best Practices for Deployments with Large Data Volumes

The Salesforce Bulk API – Maximizing Parallelism and Throughput Performance When Integrating or Loading Large Data Volumes
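A core piece of advice from the parallelism document is to order child records by their parent before batching, so that records sharing a parent land in the same batch and concurrent batches don't contend for the same parent-record lock. A sketch of that idea, assuming an in-memory list of records with a `ParentId` key (the field name and sample data are illustrative; 10,000 records is the Bulk API's per-batch ceiling):

```python
def batches_by_parent(records, batch_size=10_000):
    """Yield Bulk API batches with child records grouped by parent.

    Sorting by ParentId before slicing keeps children of the same parent
    in one batch, so parallel batches touch disjoint parents and avoid
    lock contention (per the Bulk API parallelism guidance).
    """
    ordered = sorted(records, key=lambda r: r["ParentId"])
    for i in range(0, len(ordered), batch_size):
        yield ordered[i:i + batch_size]

# Illustrative usage: both children of parent 001A end up in the first batch.
recs = [{"ParentId": "001B"}, {"ParentId": "001A"}, {"ParentId": "001A"}]
batches = list(batches_by_parent(recs, batch_size=2))
```

The same grouping idea applies whichever ETL tool actually submits the batches; the tool just needs to preserve the sort order when it slices the file.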


Good luck!