Now that you have your Salesforce™ org up and running and your data loaded, you may have realized that your data is full of duplicates. Read on to understand why duplicates exist and what you can do to mitigate the issue.

Building a client or prospect database has never been easier than with Salesforce™.
In a few clicks you are up and running, entering your first data, sending out your first emails, receiving your first leads, etc.

“One thing you can be sure of is that there will be duplicates, so you better plan for it as early as possible”

We work with this every day, and believe us: deduplication is NOT just merging two records with the same email address.

Top 5 reasons why duplicates exist and keep accumulating:

  1. Multiple sources for your data (including web-to-lead, user entry, data imports, and synchronizations with back-office or marketing applications). Each source delivers different spellings and levels of detail.
  2. Users do not search properly before they create a new record.
  3. Importing records into Salesforce only facilitates matching on a single key (e.g. email).
  4. Web-to-lead forms and other marketing apps insert leads with limited checking for duplicates.
  5. Lack of analytics to identify and measure the size of your problem.

It all comes down to two things: matching, and taking action on potential duplicates.

Matching is where the technology resides: everything from matching only on emails (as most marketing apps do) to applying sophisticated search and matching algorithms, including fuzzy logic, numeric matching, etc.
The first question is: do you need a basic solution or a more powerful one?
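To illustrate the difference between the two ends of that spectrum, here is a minimal sketch in Python (not DataTrim's actual algorithm; the field names, sample records, and threshold are illustrative) comparing a basic email-only match with a fuzzy name match using the standard library's difflib:

```python
from difflib import SequenceMatcher

def exact_email_match(a, b):
    """Basic matching: identical, case-insensitive email addresses."""
    return a["email"].strip().lower() == b["email"].strip().lower()

def fuzzy_name_match(a, b, threshold=0.85):
    """Fuzzy matching: similarity ratio on normalized full names."""
    name_a = f"{a['first']} {a['last']}".lower()
    name_b = f"{b['first']} {b['last']}".lower()
    return SequenceMatcher(None, name_a, name_b).ratio() >= threshold

rec1 = {"first": "Jon", "last": "Smith", "email": "jon.smith@acme.com"}
rec2 = {"first": "John", "last": "Smith", "email": "jsmith@acme.com"}

print(exact_email_match(rec1, rec2))  # False: the emails differ
print(fuzzy_name_match(rec1, rec2))   # True: the names are near-identical
```

An email-only rule misses this pair entirely, while the fuzzy rule flags it; that gap is exactly why duplicates survive basic matching.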

Taking action on potential duplicates. This is where it gets interesting…

Automated? Assisted? Manual? Merge? Alert? Block?
Automation is good for exact matches.
Mass merge or other assisted ways of dealing with many duplicates quickly is a must, but be careful!
Study your data well and make sure you do not have records with ‘default’ values like unknown@unknown.com or names like [Not Provided]. Such records may be “false” duplicates and should not be merged.
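As a sketch of that precaution (the placeholder values below are examples; adjust them to whatever defaults appear in your own data), a matching routine can refuse to treat known placeholder values as a duplicate signal:

```python
# Values that should never serve as a match key: they are data-entry
# defaults, not real identifiers, so two records sharing them are
# likely "false" duplicates.
PLACEHOLDER_EMAILS = {"unknown@unknown.com", "noemail@none.com", ""}

def safe_to_match_on_email(a, b):
    """Treat a shared email as a duplicate signal only if it is a real value."""
    ea = a.get("email", "").strip().lower()
    eb = b.get("email", "").strip().lower()
    if ea in PLACEHOLDER_EMAILS or eb in PLACEHOLDER_EMAILS:
        return False
    return ea == eb

a = {"email": "unknown@unknown.com"}
b = {"email": "unknown@unknown.com"}
print(safe_to_match_on_email(a, b))  # False: shared placeholder, not a real dupe

print(safe_to_match_on_email({"email": "jane@acme.com"},
                             {"email": "Jane@Acme.com"}))  # True: real shared email
```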
What about ownership? If both records in a merge are owned by the same user, fine: that user stays the owner of the consolidated record. But what if the owners are different? Who is the appropriate owner? And over time you may have record owners who have left the company; they should not remain owners of your surviving records.
Blocking duplicates works well for exact duplicates on user entry, but what happens if you block duplicates from a web-to-lead form or an import of new contacts from a tradeshow? Will you be missing out on the opportunity?

Unfortunately, not all records can be treated in the same manner. An effective solution should provide:
- Options to mass merge the obvious duplicates,
- Review options for the trickier ones,
- Survivorship rules, to help you define which record to keep as the master.
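A minimal sketch of survivorship logic in Python (the rules here, newest-modified record wins and non-empty fields survive, are illustrative assumptions; real solutions let you configure them):

```python
from datetime import date

def merge_with_survivorship(records):
    """Pick the most recently modified record as the master, then fill any
    blank fields on the master from the losing records (non-empty wins)."""
    ranked = sorted(records, key=lambda r: r["last_modified"], reverse=True)
    master, losers = dict(ranked[0]), ranked[1:]
    for loser in losers:
        for field, value in loser.items():
            if value and not master.get(field):
                master[field] = value
    return master

recs = [
    {"name": "Jane Doe", "phone": "", "title": "CTO",
     "last_modified": date(2014, 1, 5)},
    {"name": "Jane Doe", "phone": "555-0100", "title": "",
     "last_modified": date(2013, 6, 1)},
]
merged = merge_with_survivorship(recs)
print(merged["title"], merged["phone"])  # CTO 555-0100
```

Note how the merged record keeps the newer title but recovers the phone number that only the older record held; that is the whole point of survivorship rules over blind "keep the newest" merging.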

Not all sources can be treated the same way.

Being able to identify and classify your dupes based on, for example, a confidence indicator allows you to decide effectively which action to take (merge, warn, inform) and thus clean out your duplicates. A confidence indicator lets you take the obvious duplicates out first and review the others more thoroughly later, eventually sharing the information with the record owners so they can decide which data to keep.
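One way to sketch such a confidence-based triage (the thresholds and action names below are illustrative assumptions, not DataTrim's actual settings):

```python
def triage(confidence):
    """Map a match-confidence score (0-100) to a recommended action."""
    if confidence >= 95:
        return "auto-merge"     # obvious duplicate: safe to merge automatically
    if confidence >= 70:
        return "review"         # likely duplicate: queue for a data steward
    if confidence >= 40:
        return "notify-owner"   # possible duplicate: let the record owner decide
    return "ignore"             # too weak a match to act on

for score in (98, 80, 50, 10):
    print(score, triage(score))
```

The high-confidence bucket absorbs the bulk of the cleanup with no manual effort, leaving only the ambiguous middle band for human review.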

Summary:

What you should be looking for is a solution which:

  1. Is easy to set up and use (no need for manual setup of rules, coding, etc.)
  2. Targets duplicates no matter how they get into your database (any data source)
  3. Provides sophisticated matching rules (preferably built-in) and the ability to effectively review and merge/purge/keep potential duplicates.
  4. Integrates smoothly with your users’ daily work (e.g. record creation), facilitates review processes for data stewards as well as end-users (collaboration), and offers reporting capabilities to monitor and measure your data quality.
  5. Provides the flexibility to accommodate your special needs and requirements through settings or, for more advanced usage, through customization options.

DataTrim is a Salesforce™ ISV partner providing deduplication solutions for Salesforce. Regardless of how your data enters Salesforce, we have a solution for you. Read more at: www.datatrim.com

Learn More about DataTrim Dupe Alerts for Salesforce™

Get a free trial of Dupe Alerts on the AppExchange
DataTrim Dupe Alerts, Trailer (Video)
DataTrim Dupe Alerts Blog: Why Duplicates Exist, and more...

Contact Us

Please do not hesitate to reach out to us, we are happy to discuss your needs, and see how our solution can address YOUR challenges.

DataTrim’s solutions add experience-based data cleaning processes to lead management, marketing automation, customer support, and account management processes in Salesforce, creating valuable impact on day-to-day usage and productivity in a simple-to-use, collaborative, and cost-effective way.

We call this the Data Laundry

Sharing is caring!