Riddle Me This Batman

An Experian report found that 88 percent of companies see the direct effect of inaccurate data on their bottom line, losing an average of 12 percent of their revenue. In a similar study on database marketing, organizations estimated that they could increase sales by 29 percent with corrected customer data (source: Internal Results).

What’s wrong with this picture? According to a report with analysis from Paxata, Accenture, and Microsoft: 

  • Almost 40 percent of respondents feel that data quality is still a problem 
  • 72 percent are investing more than $2 million in artificial intelligence (AI) projects
  • Gartner predicts that through 2022, 85 percent of AI projects will deliver erroneous outcomes due to bias in data, algorithms, or the teams responsible for managing them 

I would like to bring up data quality. Specifically, if your organization is one of those willing to shell out $2 million for an AI project when you already know you have a data problem, and there is an 85 percent chance the project will fail, it might be money better spent to clean up your data first.

Do most organizations use dedicated data quality tools? Nope, they use Excel – well, I guess Excel could be called a tool. Interestingly, about 37 percent of any organization’s data comes from external, second-party, and third-party sources. How is that vetted for accuracy?

Plus, there is a healthy mix of structured and unstructured data that must meet organizational quality standards. It’s a tad easier to identify data quality issues in structured data, whereas validating unstructured data is pretty much a shot in the dark. Whose responsibility is it to take charge of this mess? The IT team or the business owner? Dare I say, both – otherwise we will be going around in circles forever.

What are some of the common data problems?  

  • Poor information retrieval – not knowing the correct terms to retrieve the information you need; staff generally spend 30 percent of their time just looking for the right data and, even worse, in 40 percent of searches people never find the data they were looking for in the first place 
  • Unusable and/or inconsistent data
  • Duplicate data can impact revenue faster than any other data issue 
  • Poor data security – 20 percent of people say that they would never consider doing business again with a company that failed to handle their data in a professional and secure manner 
  • Poorly defined data – data in the wrong field or filed incorrectly 
  • Incorrect data – data decays at a rate of 2.2 percent per month, and 10 to 25 percent of data contains errors (see the quick calculation below) 
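
To put that decay rate in perspective, 2.2 percent a month compounds to roughly a quarter of your records going bad within a year. Here is a quick back-of-the-envelope sketch in Python – the monthly rate is the figure quoted above; everything else is illustrative:

```python
# Compound a 2.2 percent monthly decay rate over twelve months.
monthly_decay = 0.022
months = 12

still_accurate = (1 - monthly_decay) ** months  # about 0.77
print(f"After {months} months, roughly {1 - still_accurate:.1%} of records have decayed.")
# -> After 12 months, roughly 23.4% of records have decayed.
```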

A data quality initiative isn’t for the faint of heart. Data capture standards, quality controls, integration of data types for analysis – structured, semi-structured, and unstructured – identification of duplicates, obsolete data, and compliance are just a few of the areas to tackle when solving data quality issues. Chances are, you already have a pretty good idea where your weaknesses exist. And don’t forget user error – human error is still the main cause of inaccurate data, according to Experian Data Quality’s 2017 global data benchmark report.
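
As one illustration of where tooling beats Excel, duplicate, incomplete, and obsolete records can all be flagged in a few lines of code. A minimal sketch using pandas – the column names, the sample data, and the 18-month staleness cutoff are assumptions for illustration, not a prescription:

```python
import pandas as pd

# Hypothetical customer extract; the columns and values are illustrative only.
records = pd.DataFrame({
    "email": ["a@example.com", "A@Example.com", "b@example.com", None],
    "company": ["Acme", "Acme", "Beta Corp", "Beta Corp"],
    "last_verified": pd.to_datetime(
        ["2024-01-15", "2021-06-01", "2022-11-30", "2020-03-10"]
    ),
})

# Normalize before matching, or trivial case differences will hide duplicates.
records["email_norm"] = records["email"].str.lower()

# Duplicate records: the same normalized key appearing more than once.
duplicates = records[
    records["email_norm"].notna()
    & records.duplicated(subset="email_norm", keep=False)
]

# Poorly defined / incomplete records: a required field left empty.
incomplete = records[records["email"].isna()]

# Obsolete records: nothing re-verified in the last 18 months (an assumed cutoff).
cutoff = pd.Timestamp.today() - pd.DateOffset(months=18)
stale = records[records["last_verified"] < cutoff]

print(f"{len(duplicates)} possible duplicates, {len(incomplete)} incomplete, "
      f"{len(stale)} stale out of {len(records)} records")
```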

In the field, we are hearing more and more from prospects and clients about cleansing ‘legacy’ content, partially driven by the increasing amount of data being migrated to the cloud. Organizations used to think they were ok when that data was out of sight and out of mind. Read more about us in this file analytics report by a leading research firm. We can’t help you with all your data quality issues, but we can make a huge impact. Check out our content optimization solution.