Recently I had the pleasure to chat with Alex Tapscott, founder and CEO of Northwest Passage Ventures and coauthor—with his dad, the famed business theorist Don Tapscott—of the new book, “The Trust Protocol: How Blockchain Technology Will Change Money, Business and the World,” just published by Penguin’s Portfolio imprint. Together, Alex and I delved into the thorny issues of virtual currency, the uses of blockchain for financial transactions, and what’s to come.
Here, Alex and I discuss some unresolved issues about cryptocurrencies, perhaps most importantly about who’s minding the store. Heading off hacker thieves is critical, as is—not surprisingly—satisfying regulators that such financial transactions via the various types of blockchains are indeed valid.
It’s not been easy, and that has led to a problem of trust. Alex mentioned to me Mt. Gox and Silk Road, two of the most notable scandals impacting digital currency, which badly hurt its early reputation, but he didn’t have to go back a couple of years. Most recently, the Hong Kong bitcoin exchange Bitfinex said it had some $72 million stolen in a serious hack—a blow that sent the currency’s value slumping as confidence ebbed.
“Yes, there are a lot of issues unresolved right now, and many of them need to be solved in order for blockchains to reach their potential,” Alex said. “The questions of interoperability and scalability certainly are important, but so is the question of law enforcement. How do we make sure that criminals don’t use this, or if they do that we’re able to stop them? And they’re all governance questions really.”
Alex and I discussed the basic question: Who is going to lead and who is going to govern this brave new world of cryptocurrency? Will it be government or the industry itself?
“It’s a governance network, but with a small ‘g,’ ” Alex noted. “Now, that doesn’t mean that there isn’t an important role for government regulation. Regulators are really critical stakeholders when you’re talking about things of value, like financial assets or money. They’ve always had a strong role in the financial services industry and they will continue to have one. But many of these issues are outside of their expertise. The ecosystem needs a standards network.”
Alex and I discussed the Internet’s standards network, the Internet Engineering Task Force, which hammers out specifications for protocols and formats such as HTTP, HTML and XML. Now, Alex said, the same thing needs to happen with blockchain transactions to enable the full utilization of digital currency.
“It’s going to require people from all these different silos, Ethereum, bitcoin and private blockchains among them, to begin to discuss and communicate with each other. Technical standards are just one of many different issues that need to be resolved. There’s going to be a need for standards networks for everything from smart contracting, to titles and deeds, to standards for financial assets. All these things are still left to be resolved.”
My conversation with Alex about cryptocurrencies was wide-ranging and intriguing. As he said to me about the future of digital monies, “A lot of banks and companies and governments are waking up to the potential of this technology. They love the idea of frictionless payments, of secure networks, of lower cost, of better speed.”
The world of cryptocurrency continues to evolve. Let’s see what’s next in this brave new world. Will all our payments be in crypto money? In future posts, we’ll delve into a lot more about the future of cryptocurrencies, including more about regulatory, trust and security aspects. Hope to see you there!
Most marketers I talk with today say they are drowning in data. But in reality, the data they really want sits in disparate systems throughout the organization. Or, if their company has invested big money in a traditional data warehouse, the results have fallen short of expectations. This is because traditional data warehouses were never designed to handle the volume, variety and velocity of today’s data-centric applications. So, while most marketers and most companies “talk” about big data, they just go on with “business as usual,” taking little or no action.
This is not just my opinion. Recently, my UK colleague, Richard Petley, director of PwC Risk and Assurance, conducted a survey of 1,800 senior business leaders in North America and Europe, and only a small percentage reported effective data management practices. Some 43 percent of companies surveyed “obtain little tangible benefit from their information,” while 23 percent “derive no benefit whatsoever,” according to the study. That means two-thirds of organizations surveyed lack the skills and technology to use their data to gain an edge on competitors.
The problem is not access to data. It’s the management of it. What companies really need is the ability to manage large amounts of data in a safe, agile and adaptable fashion. And that means they need a more modern data warehouse.
The overall purpose of a data warehouse is to integrate corporate data from various internal and external sources. Implementing a data warehouse has traditionally been a long, costly and risky process, and when the solution is finally ready, it’s often slow, outdated and hard to update as the business changes. A modern data warehouse is different, employing new technologies, products, and approaches that allow for both speed and agility.
With a modern data warehouse, you only have to query one source to get the data you need. When you add automation to the mix, you can load, clean, integrate, and format the data in record time.
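To make the load-clean-integrate-format idea concrete, here is a minimal sketch in Python of what a warehouse automation layer does behind the scenes. The source feeds, field names and cleaning rules are hypothetical illustrations, not any particular vendor’s pipeline.

```python
# Minimal sketch of warehouse automation: extract from several sources,
# clean each record, then integrate them into one queryable table.
# All source names and fields below are hypothetical.

def extract():
    # Stand-ins for a CRM export and a web-analytics export.
    crm = [{"email": " Ana@Example.com ", "revenue": "1200"}]
    web = [{"email": "ana@example.com", "visits": "34"}]
    return crm, web

def clean(record):
    # Trim whitespace, lowercase email keys, cast numeric strings.
    out = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = value.strip()
        if key == "email":
            value = value.lower()
        elif isinstance(value, str) and value.isdigit():
            value = int(value)
        out[key] = value
    return out

def integrate(crm, web):
    # Merge both feeds on the cleaned email key into one unified table,
    # so analysts only have to query a single source.
    table = {}
    for rec in map(clean, crm + web):
        table.setdefault(rec["email"], {}).update(rec)
    return list(table.values())

crm, web = extract()
warehouse = integrate(crm, web)
print(warehouse)  # one unified row per contact: email, revenue, visits
```

Automating exactly these steps is what turns 35 hours of manual “data munging” into minutes.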
POSSIBLE is a creative agency that brings results-driven digital solutions to some of the world’s most dynamic brands. Every two weeks, analysts faced the herculean task of reporting campaign results based on 10 different data sources, applying 20 different measures on 70+ products delivered by 100+ media partners. They would spend on average 35 hours just processing data before they could begin analysis.
When I spoke to the POSSIBLE team, they told me a free demo had introduced them to a data warehouse automation tool from TimeXtender. After a surprisingly fast implementation period, the “data munging” performed by POSSIBLE’s analysts has now been reduced by 68%. “This has really turned out to be a big win for us. The fact that we can now get actionable data to analysts so much faster allows us to spend more time providing valuable insights to clients,” says Harmony Crawford, Associate Director of Marketing Sciences.
As my colleague Richard Petley likes to say, “Data is the lifeblood of the digital economy.” It can provide insight, inform decisions, deepen relationships and drive competitive advantage, but only if it’s managed in an agile and adaptable way.
So the next time you find yourself complaining about the problem with big data, stop talking and start researching the modern data warehouse and data warehouse automation.
The year 2016 is gearing up to be a game-changer in the realm of marketing across several categories, and anyone who wants to survive in this ever-evolving landscape needs to take a close look at what’s coming. From content promotion experimentation by content marketers of all kinds to video ads driving engagement, personalization and relevancy, a number of marketing “influencers”—the “movers and shakers” of this industry—have made their predictions about the burgeoning world of marketing communications and what they feel should be the focus in 2016.
We queried several of these marketing influencers for their thoughts and predictions on the direction of marketing in 2016. Their responses concentrated on five important perspectives:
- Content Continues to Evolve
- Focus on Personalization
- Gathering Good Data is More Important Than Ever
- Knowing Your Audience is Key
- Look Outside the Marketing Box
Take a closer look at each one here: 2016-predictions-from-10-top-influencers
Here is a recent interview with my new friend Gregg Thaler, a self-professed data quality junkie and the Chief Revenue Officer of RingLead. We discussed some best practices in data. I hope you enjoy it!
PD: How often is poor data the downfall of a marketing campaign?
GT: Well, if I said every time, that would be a bit of hyperbole, but only a bit. Typically what I find with lists – especially trade show lists – is that they are the Typhoid Mary of duplicate creation in your database. Very often if you think about trade shows, who attends your booth? Many times it’s your customers or existing prospects. And booth staff will scan them indiscriminately. If you import that list without taking preventative measures, then it creates a duplicates horror show.
There’s another category of bad lists: those purchased from vendors. And there’s a very fundamental reason why lists purchased from data vendors have data quality challenges. Contact data ages like fish, not like fine wine; it gets worse as it gets older, not better. Data is foundational. CRM and marketing automation are merely vessels; the data they contain is the true treasure. It’s the single most critical element in determining revenue success or failure. The outcome of everything else you do further down the order of operations depends completely on the input. What do you get from an automated process when you put garbage in? Garbage out.
PD: What would you say is one of the biggest mistakes you see B2B marketers making when it comes to data?
GT: Well, the mistake I’m going to describe isn’t really limited to marketers. If you want to truly realize the strategic benefits of optimized data, you have to have a mindset of preventing errors at the source. There’s a simple reason for that: generally speaking, whatever it costs the enterprise to get the data right at the point of creation, it’s going to cost you ten times that to fix it later. And if you do nothing, the damage caused by inferior data could be much worse; 100x worse is possible.
So often marketers come to us with their hair on fire: they’ve got to dedupe their database right now. Yes, they’re right. You do have to remediate the situation, since almost every contact database is riddled with duplicate and non-standard data. Often, however, I hear brilliant marketers say something incredibly dumb: “…we’ll worry about the prevention later.” Are you kidding me?
PD: Who ultimately owns the data? Is it really sales who owns it, or is it really marketing?
GT: That’s a terrific question. Where should data governance reside? Who is the data steward? Traditionally it has been IT. Increasingly we are seeing the data steward role reside in marketing and sales, which is where I believe it rightly belongs. Now, what we see in the market is that the people who actually perform these data janitor-like tasks are usually in marketing operations, followed very closely by their colleagues in sales ops. The best practice, I believe, is for organizations to have a data quality center of excellence, typically reporting into sales and marketing operations, even if the COE is one person.
PD: Let’s talk about the best way to boost productivity in marketing.
GT: This has, of late, become a favorite topic of mine. I think it’s something that gets too little focus as it relates to peak performance, not just for marketers but throughout the organization. What I’m talking about is, first, the batch normalization of an org’s data, and then the automated enforcement of data standards with technology.
I spoke to two very well-known technology companies and asked them, “How many technologies do you have in your marketing stack?” One responded 35 and the other said 22. Think about the performance of all of those applications operating on what, for most people, is a completely non-standard set of data. That contributes to very poor application performance, which then translates into poor performance across the entire marketing stack. It’s a silent killer of revenue and application performance. People probably aren’t aware of how well those applications could perform on a standardized data set; they’ve never seen one. In the benchmarking we’ve done, we’ve seen eye-popping performance gains of up to 600% in duplicate detection on a standardized versus a non-standardized data set. Compound that kind of improvement across the 35 apps that modify that data set, and the gains in application performance can be off the charts.
I am completely convinced that data standardization is the single most impactful action, a unique foundation-level enabler that marketers and sales professionals can take to optimize their data for revenue performance and their applications for speed, accuracy and efficient operation.
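The standardize-then-dedupe order Gregg argues for can be sketched in a few lines of Python. The contact records and normalization rules below are purely illustrative; the point is that two records describing the same contact only compare equal after each field is forced into one format.

```python
# Sketch of "standardize first, then dedupe". Records and rules are
# hypothetical; real tools apply far richer normalization dictionaries.

def standardize(contact):
    # Enforce one canonical format per field so equal contacts compare equal.
    return {
        "email": contact["email"].strip().lower(),
        "company": contact["company"].strip().rstrip(".")
                                     .replace(", Inc", " Inc").title(),
    }

contacts = [
    {"email": "Pat@Acme.com ", "company": "ACME, Inc."},
    {"email": "pat@acme.com", "company": "Acme Inc"},
]

# Naive exact-match dedup on the raw records misses the duplicate...
raw_unique = {tuple(sorted(c.items())) for c in contacts}

# ...while the same exact-match dedup on standardized records catches it.
std_unique = {tuple(sorted(standardize(c).items())) for c in contacts}

print(len(raw_unique), len(std_unique))  # raw keeps both; standardized keeps one
```

The raw set still holds two “different” contacts, while the standardized set collapses them to one, which is why standardization makes duplicate detection dramatically more effective across every app that touches the data.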