Interview with Gregg Thaler from RingLead
Here is a recent interview with my new friend Gregg Thaler, a self-professed data quality junkie and the Chief Revenue Officer of RingLead. We discussed some best practices in data. I hope you enjoy!
PD: How often is poor data the downfall of a marketing campaign?
GT: Well, if I said every time, that would be a bit of hyperbole, but only a bit. Typically what I find with lists – especially trade show lists – is that they are the Typhoid Mary of duplicate creation in your database. Very often if you think about trade shows, who attends your booth? Many times it’s your customers or existing prospects. And booth staff will scan them indiscriminately. If you import that list without taking preventative measures, then it creates a duplicates horror show.
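The preventative measure Gregg describes, screening a scanned list against the CRM before import, can be sketched roughly as below. This is a hypothetical illustration, not RingLead's product logic; the `email` field, the sample records, and the matching rule (normalized email only) are all assumptions for the sake of the example.

```python
# Hypothetical sketch: screen a trade-show list against existing CRM
# contacts before import, matching on a normalized email address.
# Field names and sample data are illustrative assumptions.

def normalize_email(email: str) -> str:
    """Lowercase and strip whitespace so trivially different entries match."""
    return email.strip().lower()

def split_new_vs_existing(scanned_leads, crm_contacts):
    """Return (new_leads, duplicates) relative to the CRM."""
    existing = {normalize_email(c["email"]) for c in crm_contacts}
    new_leads, duplicates = [], []
    for lead in scanned_leads:
        key = normalize_email(lead["email"])
        if key in existing:
            duplicates.append(lead)
        else:
            existing.add(key)  # also catches duplicates within the list itself
            new_leads.append(lead)
    return new_leads, duplicates

crm = [{"email": "ada@example.com"}]
scans = [
    {"email": "Ada@Example.com "},   # existing customer, scanned at the booth
    {"email": "grace@example.com"},  # genuinely new prospect
    {"email": "GRACE@example.com"},  # same prospect scanned twice
]
new, dupes = split_new_vs_existing(scans, crm)
print(len(new), len(dupes))  # 1 new lead, 2 duplicates screened out
```

In practice matching on email alone is too naive; real dedupe tools combine several fields with fuzzy matching, but the principle of filtering before import rather than cleaning up afterwards is the same.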
So, there’s another category of bad lists, and those are the lists purchased from vendors. There’s a very fundamental reason why lists purchased from data vendors have data quality challenges: contact data ages like fish, not like fine wine. It gets worse as it gets older, not better. Data is foundational. CRM and marketing automation are merely vessels; it is the data they contain that is the true treasure. It’s the single most critical element in determining revenue success or failure. Everything else you do further down the order of operations depends completely on the input. What do you get from an automated process when you put garbage in? Garbage out.
PD: What would you say is one of the biggest mistakes you see B2B marketers making when it comes to data?
GT: Well, the mistake that I’m going to describe isn’t really limited to marketers. If you want to really truly recognize the strategic benefits of having optimized data, you have to have a mindset to prevent errors at the source. There’s a simple reason for that. Generally speaking, whatever the cost is in the enterprise to get the data right at the point of creation it’s going to cost you ten times that later to fix it. But then, of course, if you do nothing, the damage caused by inferior data could be much worse, 100x worse is possible.
So often marketers come to us with their hair on fire: they’ve got to dedupe their database right now. Yes, they’re right. You do have to remediate the situation, since almost every contact database is riddled with duplicate and non-standard data. Often, however, I hear brilliant marketers say something incredibly dumb: “…we’ll worry about the prevention later.” Are you kidding me?
PD: Who ultimately owns the data? Is it really sales who owns it, or is it really marketing?
GT: That’s a terrific question. Where should data governance reside? Who is the data steward? Traditionally it has been IT. Increasingly we are seeing the data steward role reside in marketing and sales, which is where I believe it rightly belongs. Now, what we see in the market is that the people who actually perform these data janitor-like tasks are usually in marketing operations, followed very closely by their colleagues in sales ops. The best practice, I believe, is for organizations to have a data quality center of excellence, typically reporting into sales and marketing operations, even if the COE is one person.
PD: Let’s talk about the best way to boost productivity in marketing.
GT: This has, of late, become a favorite topic of mine. I think it’s something that’s under-appreciated as it relates to peak performance, not just for marketers but throughout the organization. What I’m talking about is, first, the batch normalization of an org’s data, and then the automated enforcement of data standards with technology.
I spoke to two very well-known technology companies and asked them, “How many technologies do you have in your marketing stack?” One responded 35 and the other 22. Think about all of those applications operating on what, for most people, is a completely non-standard set of data. That leads to very poor application performance, which then translates into poor performance across the entire marketing stack. It’s a silent killer of revenue and application performance. People probably aren’t aware of how well those applications could perform on a standardized data set, because they’ve never seen one. In the benchmarking we’ve done, we’ve seen eye-popping performance gains of up to 600% in duplicate detection on a standardized vs. non-standardized data set. Compound that kind of improvement across the 35 apps that modify that data set and the gains can be off the charts.
I am completely convinced that data standardization is the single most impactful, foundation-level action that marketers and sales professionals can take to optimize their data for revenue performance and their applications for speed, accuracy and efficient operation.