Two Steps to Optimize Your Data Model and Avoid CRM Performance Degradation

“In our organization, each member of our staff amasses hundreds (sometimes thousands) of records daily for the cases they work on. It’s an overflowing amount of data! What’s the best way to manage this volume of data to ensure there is no degradation in Salesforce performance?”
This question is more common than you may think! Though not all of you deal with thousands of records on a daily basis, every Salesforce customer should understand how to resolve data management issues. A solid data management and architecture structure for dealing with large volumes of data sets you up for success now and in the future.
The key thing to remember is: You, yes you, should worry about Large Data Volume (LDV).
Though there’s no single measurement that determines whether your organization has a large volume of data, both large and small orgs are at risk. The number of records in an org is a reliable indicator, but the way the data is structured is another factor: organizations with smaller data volumes but poorly architected data models can still have data-related performance issues. In general, though, an organization will most likely be considered to have large data volumes if it meets one or more of the following qualifications:
- More than 5 million records
- Thousands of users with concurrent access
- Parent objects with more than 10,000 child records (see the sketch after this list)
- More than 100 GB of used storage space
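On the third point, skew toward a single parent record is easy to check for. Here’s a minimal sketch in anonymous Apex, assuming the standard Account-to-Contact relationship; in a truly large org you’d run a check like this through Batch Apex or the Bulk API instead, since aggregate queries are subject to governor limits.
```apex
// Minimal sketch: find parent Accounts that exceed the ~10,000-child
// guideline for Contacts. Swap in whatever parent/child pair you use.
for (AggregateResult ar : [
        SELECT AccountId parentId, COUNT(Id) childCount
        FROM Contact
        WHERE AccountId != null
        GROUP BY AccountId
        HAVING COUNT(Id) > 10000]) {
    System.debug(String.valueOf(ar.get('parentId')) + ' has '
        + String.valueOf(ar.get('childCount')) + ' child records');
}
```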
Issues that can be caused by Large Data Volumes
Large data volumes typically show up as performance-related issues: longer-than-expected search times, long waits for fields to populate when a record is opened, slow record saves, and other actions that take excessively long to complete.
In addition to performance issues, large data volumes can also lead to record locking on parent records with large numbers of child records. Every time a child record is saved, its parent record is temporarily locked, so if thousands of child records are being accessed and updated simultaneously, their parent records will remain locked for extended periods of time. A huge inconvenience. One common mitigation is sketched below.
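A minimal sketch, assuming a standard Contact-to-Account relationship (the filter and the field being updated are purely illustrative): sorting child records by their parent before DML means each 200-record chunk locks as few distinct parent Accounts as possible, and releases them sooner.
```apex
// Minimal sketch: group child updates by parent to reduce lock contention.
// Sorting by AccountId means each 200-record DML chunk touches fewer
// distinct parent Accounts, so no parent stays locked across many chunks.
List<Contact> children = [
    SELECT Id, AccountId
    FROM Contact
    WHERE Account.Industry = 'Education'  // illustrative filter
    ORDER BY AccountId
    LIMIT 10000
];
for (Contact c : children) {
    c.Description = 'Reviewed during data cleanup'; // illustrative change
}
update children; // Apex DML processes this list in 200-record chunks
```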
So let’s talk now about some solutions and proactive steps you can take to protect your org’s performance and see success with Salesforce.org Nonprofit Cloud, Education Cloud, and more.
1. Develop an Effective Archiving and Reporting Strategy
Salesforce provides a wide variety of tools to help organizations with large data volumes get a handle on their data. These tools can do wonders for an organization, but they won’t be nearly as effective without an underlying plan for their proper utilization. So if you’re going to take one thing away from this blog post, it should be that the most important aspect of handling any amount of data is to create an effective data management and reporting strategy before implementing a technical solution.
Some of the data management questions you’ll want to consider are:
Question | Recommendation |
---|---|
What does my current data model look like? | Create a data model diagram and roadmap showing the current and potential future state of every object you use, then highlight areas where large record counts may become a concern. For help: Trailhead: Data Modeling |
How long do I need to retain my data? | Think through retention requirements for each object: donations may need to be retained for tax reporting purposes, and student records may need to be kept for a set period. Document your retention policy. (The sketch after this table shows one way to size up your history.) |
What other systems does my Salesforce data need to flow to? | Create a data flow diagram showing source and target applications, which data elements flow between them, and their volumes and other attributes, so that archiving won’t cause downstream effects in other systems. |
What are the source systems for my Salesforce data? | Create a data flow diagram of data that’s created in other systems and replicated into Salesforce, and consider whether external data should instead remain in its source system and be accessed from Salesforce via lookups. |
How will archiving or purging an object affect related objects? | Document any scenario where child records must be retained longer than their parents, and create a plan. For help: Data Management Trailhead |
Does all of your data need to be stored in Salesforce, or can it be stored externally and accessed through reporting? | Many Salesforce customers keep a year of data in Salesforce itself, store additional years in a data warehouse, and access older records by simply running a report. |
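To start answering the retention question, here’s a minimal sketch in anonymous Apex, with the standard Opportunity object standing in for donations; very large orgs may need Batch Apex or report-based counts instead.
```apex
// Minimal sketch: count Opportunities per calendar year to see how much
// history the org is carrying and where a retention cutoff would bite.
for (AggregateResult ar : [
        SELECT CALENDAR_YEAR(CloseDate) yr, COUNT(Id) total
        FROM Opportunity
        WHERE CloseDate != null
        GROUP BY CALENDAR_YEAR(CloseDate)
        ORDER BY CALENDAR_YEAR(CloseDate)]) {
    System.debug(String.valueOf(ar.get('yr')) + ': '
        + String.valueOf(ar.get('total')) + ' records');
}
```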
Objects that can typically be considered for archiving by .org customers:
In addition to the standard Salesforce objects, nonprofits and Higher Ed institutions have additional sets of objects that should be taken into consideration in any discussion about data. It’s up to each organization to determine its processes for these objects based on its own unique use cases, but they should be included in any well-constructed data management strategy:
Higher Ed Organizations:
- Former students who have graduated, withdrawn, or haven’t enrolled in any new classes for a certain amount of time
- Unaccepted applicants
- Old Course Enrollments
Nonprofits:
- Inactive Donors
- Donations older than a certain time period (e.g. LYBUNT) and their related records (see the query sketch after this list)
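For either list, here’s a minimal sketch of selecting archiving candidates by age, assuming donations live on the standard Opportunity object; the 7-year cutoff is only a placeholder for whatever your documented retention policy says.
```apex
// Minimal sketch: page through donations older than the retention cutoff.
Date cutoff = Date.today().addYears(-7); // placeholder retention period
List<Opportunity> candidates = [
    SELECT Id, Name, Amount, CloseDate
    FROM Opportunity
    WHERE IsClosed = true AND CloseDate < :cutoff
    ORDER BY CloseDate
    LIMIT 200
];
System.debug('Archiving candidates in this page: ' + candidates.size());
```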
2. Choose a Data Management Tool
Once a data management strategy has been put in place, the next step is to determine the best way to execute it. Salesforce provides a number of tools that can help in this area. Check the table below to find the solution that best fits your needs.
Data Management Tool | Good for… | Except that… |
---|---|---|
Big Objects | Archiving hundreds of millions of records on-platform, so archived data stays inside Salesforce (see the sketch after this table) | They support only a limited subset of SOQL (queries must filter on the big object’s index), and standard reports, triggers, and page layouts aren’t available |
Data Storage Optimizer | Moving older records out of standard and custom objects into big objects to free up data storage | As with any big-object approach, relocated records are no longer visible to everyday reports and automation |
Off Platform Data Archiving | Retaining many years of history in an external data warehouse while keeping your Salesforce org lean | It requires integration (ETL) work, and users need a separate reporting tool or external objects to access the archived data |
Indexes and Skinny Tables | Speeding up reports, list views, and queries that filter large objects on specific fields | They improve read performance rather than reducing data volume, and skinny tables must be created for you by Salesforce Customer Support |
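To make the big object row concrete, here’s a hypothetical sketch. The custom big object Donation_Archive__b and its fields Source_Id__c, Amount__c, and Closed_On__c are invented for illustration; a real big object (and its index) must be defined in Setup or via the Metadata API before any code can write to it.
```apex
// Hypothetical sketch: copy old donations into a custom big object.
// Donation_Archive__b and its fields are assumed to already exist.
Date cutoff = Date.today().addYears(-7); // placeholder retention period
List<Donation_Archive__b> archiveRows = new List<Donation_Archive__b>();
for (Opportunity opp : [SELECT Id, Amount, CloseDate
                        FROM Opportunity
                        WHERE IsClosed = true AND CloseDate < :cutoff
                        LIMIT 200]) {
    Donation_Archive__b row = new Donation_Archive__b();
    row.Source_Id__c = opp.Id;       // pointer back to the original record
    row.Amount__c    = opp.Amount;
    row.Closed_On__c = opp.CloseDate;
    archiveRows.add(row);
}
Database.insertImmediate(archiveRows); // big objects use insertImmediate, not DML
// Deleting or purging the source records is a separate, carefully planned step.
```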
Helpful Resources for Your Salesforce Data Management
- Trailhead: Large Data Volumes
- Trailhead: Big Object Basics
- Best Practices for Deployments with Large Data Volumes
- Indexes – Best Practices for Deployments with Large Data Volumes
- Skinny Tables – Best Practices for Deployments with Large Data Volumes
About the author
Tom Leddy is a Principal Customer Success Architect at Salesforce.org. He plays a critical role within Salesforce.org Advisory Services to help Higher Ed and Nonprofit customers accelerate their use of Salesforce technology and best practices.
This blog is part of our larger “Ask an Architect” content series. To learn more about engaging a Salesforce.org Customer Success Architect in your organization, please contact your Account Executive.