According to TDWI, bad data cost US businesses $611 billion in 2013; IT Business Edge reports it costs the US economy over $3 trillion and is estimated to add as much as $314 billion to healthcare costs alone. The impact on organizations large and small is staggering: according to a recent article in IT Business Edge citing Ovum Research, bad data costs the average company anywhere between 15 and 25% of gross revenue.
What’s even more shocking is that many businesses ignore both the quality of the data they’re bringing in and the quality of their existing customer data, focusing primarily on sourcing the lowest-cost solution at the time. This pervasive, outdated approach has cost companies trillions in lost revenue, so when will businesses become less short-sighted?
The problem lies in the market’s perception of data. It’s too often viewed as a commodity: all data is equal, right? Yet even commodities carry distinctions, despite popular belief. Econsultancy’s Ben Davis refers to data as “the new oil,” adding that “data must be polished before it is sent out to the public, just as oil must go through a refining process to become gasoline or kerosene.” Fill your car’s gas tank with kerosene and you may have a problem getting to work; likewise, if you fail to maintain your customer data or keep bringing in bad data, your approach will predictably and inevitably end in lost revenue, poor morale, and, in many cases, the dissolution of the business.
Fifty-four percent of marketers cite “lack of data quality/completeness” as their greatest obstacle. Bad data leads to a host of problems, including higher resource consumption and maintenance costs, distorted success metrics, lower productivity, poor company morale, and lost revenue, just to name a few.
Only once organizations truly embrace the fact that data is fluid and constantly changing will they unlock the value of their customer data, as well as the prospect data they acquire from third parties.