Wednesday, June 8, 2016
Are Duplicate Fields Hurting Your Database Marketing?
A basic requirement of cost-effective direct marketing is elimination of duplicates, meaning multiple database records for the same person or company account. But marketers today need to watch out for another, trickier duplication challenge: duplicate data fields.

Expanding use of multi-channel, multi-sourced data fuels the problem. The same prospect or customer record is often enriched by data from different online and offline sources: data appending services, lead-gen services, list rentals, predictive and lead scoring services, e-mail validation services, call center entries, online ad and social platforms, events, and so on. The marketer may intend to validate, update and unify this data, but efforts are delayed or incomplete for whatever reason. Then the marketer gets ready to launch a campaign and discovers many contact records have two job titles or four industry codes or three e-mail addresses. If the data entries are not clearly dated and sourced, the marketing team has no clue which data are the most up-to-date, accurate and appropriate for targeted promotion!

We were pleased to see a recent MarketingProfs article by Ed King, CEO of the data automation firm Openprise, offer cogent advice on avoiding this costly problem. Obviously, marketers should first strive to unify field content promptly while the data is fresh. If data unification must be delayed, new data should be labeled by its source and age for use in future data consolidation decisions.

Whether field data unification is immediate or delayed, the marketing team needs to agree on a data-unification logic. King advises that this logic should be based on at least three key factors: source authority (giving priority to trusted data sources); source focus (preferring sources more aligned with or specialized for the marketer's industry and target); and age of data (for example, in B2B, a more recent contact name or company size is likely to be more accurate).
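To make that three-factor logic concrete, here is a minimal sketch of how conflicting field values might be resolved automatically. The source names, rankings, and contact data below are illustrative assumptions, not anything prescribed by King or Openprise:

```python
from datetime import date

# Hypothetical source-authority ranking: lower number = more trusted.
SOURCE_AUTHORITY = {"crm": 1, "data_append": 2, "web_form": 3}

# Hypothetical focus scores: how well a source fits our B2B target market
# (higher = better aligned).
SOURCE_FOCUS = {"crm": 3, "data_append": 2, "web_form": 1}

def pick_field_value(candidates):
    """Choose one value for a field from several sourced, dated entries.

    Each candidate is a dict with 'value', 'source', and 'as_of' (a date).
    Priority order: source authority first, then source focus, then recency.
    """
    best = min(
        candidates,
        key=lambda c: (
            SOURCE_AUTHORITY.get(c["source"], 99),  # trusted sources first
            -SOURCE_FOCUS.get(c["source"], 0),      # better-focused sources next
            -c["as_of"].toordinal(),                # newer data breaks ties
        ),
    )
    return best["value"]

# Example: one contact record carrying three conflicting job titles.
titles = [
    {"value": "Marketing Manager", "source": "web_form", "as_of": date(2016, 5, 1)},
    {"value": "VP Marketing", "source": "crm", "as_of": date(2015, 11, 15)},
    {"value": "Director, Demand Gen", "source": "data_append", "as_of": date(2016, 3, 2)},
]
print(pick_field_value(titles))  # the CRM entry wins on source authority
```

The key point the sketch illustrates is that the priority order is decided once, up front, and then applied uniformly to every record, rather than being re-argued field by field.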
Consistency and scalability are the goals; ad hoc, manual record decisions are not only less efficient but less likely to yield optimal overall results. While unifying data in fields, the database process should also normalize data so formatting and coding are consistent. Plus, a smart database effort can remap field content for better targeting. King provides the example of consolidating 2,000 industry codes into 10 custom definitions that better fit market targets.

For the whole article, see http://www.marketingprofs.com/articles/2016/30069/your-duplicate-data-problem-has-an-evil-twin-that-is-much-worse-duplicate-fields?adref=nlt060816
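King's consolidation example, i.e. collapsing thousands of granular industry codes into a handful of custom segments, amounts to a simple lookup-table remapping combined with basic normalization. The codes and segment names below are invented for illustration:

```python
# Illustrative remapping of granular industry codes (SIC/NAICS-style)
# into a few custom target segments; the codes and labels are made up.
CODE_TO_SEGMENT = {
    "7371": "Software",
    "7372": "Software",
    "3674": "Hardware",
    "6021": "Financial Services",
    "6022": "Financial Services",
}

def normalize_industry(raw_code, default="Other"):
    """Normalize a raw industry code (trim stray whitespace) and map it
    to a custom segment, falling back to a catch-all bucket."""
    return CODE_TO_SEGMENT.get(raw_code.strip(), default)

print(normalize_industry(" 7371 "))  # Software
print(normalize_industry("9999"))    # Other
```

In practice the table would be much larger and likely maintained outside the code, but the principle is the same: one agreed-upon mapping applied consistently, instead of each campaign interpreting raw codes its own way.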