The Five Data Chasms: Translating Data Into Business Results


By Mischa Dick, Managing Partner, ReInvent Work

We have all heard the saying, “data rich and information poor.”  And as sad as it sounds, that is an understatement of the real problem: turning data into business results that are measurable and earn the stamp of approval of the CFO. 

There are the data masters of the world, Google and Facebook, who know how to monetize data.  But that is not who we are talking about.  This conversation is about the manufacturing business, the local or regional healthcare provider, or the consumer goods company, businesses where data itself is not the customer value proposition.  Rather, data should be an enabler to operate more efficiently and effectively.  For these businesses, the challenge of getting from data to sensible action is big, and more technology is often not the answer.  In fact, the technologies themselves are usually more than adequate; technology is not what separates the successful from the struggling.  What matters is how the technologies already in place are taken across the chasms to enable results.

The magic lies in closing the gaps along the way, gaps that are often invisible along the information continuum.  These chasms usually do not fall within the defined scope of any of the responsible parties along that journey.  As a result, good technology solutions get a bad reputation, IT departments are seen as unhelpful, operators are frustrated with the lack of actionable information, and the CFO is in disbelief that progress toward improving financials is so slow.

It is the journey across five chasms that gets us from data to actual business results, a journey that must be taken in succession:

1 – The Data Integrity Chasm

There is the usual challenge of ingesting or connecting all the relevant data elements, normalizing them, and overcoming the processing challenges of disparate sources of information.  And there is the challenge of transforming various types of data, preferably into a discrete format that can be analyzed. 

Those are the technical challenges ETL analysts are well trained to deal with and resolve. 

The true data integrity chasm is created where more profound business knowledge has to be applied to the data to determine if it makes sense, a higher level of gap analysis.  The IT-based analyst charged with data access, import, and instantiation will keep an eye on the technical integrity of the data, but has little insight into whether the data makes reasonable sense to begin with.  The first check should be an internal referential check: do the bits add up to the whole?  Many transactional data sets contain detail that can be summed to see whether the aggregations can be recreated from the individual transactions, and often this will be the first clue of potential issues.
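This "do the bits add up to the whole" check can be sketched in a few lines. A minimal illustration, using entirely hypothetical transaction records and reported monthly totals (not any particular system's schema):

```python
# Internal referential check: recreate the aggregates from the detail
# and flag where they disagree. Data below is illustrative only.
from collections import defaultdict

# Hypothetical detailed transactions
transactions = [
    {"month": "2023-01", "amount": 120.0},
    {"month": "2023-01", "amount": 80.0},
    {"month": "2023-02", "amount": 200.0},
]

# Aggregates as reported by the source system
reported_totals = {"2023-01": 200.0, "2023-02": 250.0}

# Sum the detail back up to the reporting level
recreated = defaultdict(float)
for t in transactions:
    recreated[t["month"]] += t["amount"]

# Months where detail and aggregate disagree: the first clue of data issues
discrepancies = {
    month: (recreated[month], total)
    for month, total in reported_totals.items()
    if abs(recreated[month] - total) > 0.01
}
print(discrepancies)  # 2023-02 detail sums to 200.0, not the reported 250.0
```

Any month that appears in `discrepancies` becomes a starting point for the deeper, business-knowledge-driven review described next.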

The next level of checks takes someone close to operations, with a deeper knowledge of how the business works, to see if the data makes sense in the context of the business.  But the individuals qualified to make this determination are often not equipped to manipulate large amounts of data, assess it, and take the liberty of “playing” with it.  They are also not typically engaged at this early stage. 

When “common sense issues” exist in the data, these experts are often not involved until the data is presented to them in the analytics platform itself, which is too late.  At that stage, it is much harder to correct the issues, and there is instant credibility loss with the user community.

The way to overcome this chasm is to involve a business operator early on.  The data should be aggregated at various levels and viewed from various points of view to see if there are any common sense violations – and there will be, almost every time.  A detailed audit list of the worst offenders will then quickly reveal where the issues in the initial data lie.  The best audit lists for this purpose are not generated randomly but are highly biased: the most useful approach is to take the extreme edge cases and step the assessment back down to the individual transactional data set. 
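A deliberately biased audit list like this can be built by ranking records by how far they deviate from the norm and pulling the extremes first, rather than sampling at random. A small sketch with hypothetical per-record metric values:

```python
# Biased audit list: rank records by absolute deviation from the median
# and audit the worst offenders first. Data is illustrative only.
import statistics

records = [
    {"id": "A1", "cost_per_unit": 4.2},
    {"id": "A2", "cost_per_unit": 4.5},
    {"id": "A3", "cost_per_unit": 97.0},   # extreme high outlier
    {"id": "A4", "cost_per_unit": 4.1},
    {"id": "A5", "cost_per_unit": 0.01},   # suspiciously low
]

median = statistics.median(r["cost_per_unit"] for r in records)

# Worst offenders first: largest absolute deviation from the median
audit_list = sorted(
    records,
    key=lambda r: abs(r["cost_per_unit"] - median),
    reverse=True,
)[:3]
print([r["id"] for r in audit_list])  # ['A3', 'A5', 'A2']
```

Each record on the list is then traced back down to its individual transactions, which is exactly the "step the assessment back down" move described above.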

For example, a health plan may be looking for opportunities to improve patient care while making sure it is properly compensated for the risk it is assuming.  The risk indicators from diagnosis codes on the claims should parallel findings of service utilization.  If a large segment of patients shows high-risk utilization, yet the health plan does not see any diagnoses that align with that utilization, something is likely amiss in the data.

The biggest challenge in overcoming this chasm is that it takes a curious mind, with no particular outcome in view, to review the data.  The process is organic: a few pointed questions are asked and answered using the data, which then leads to cascading questions, all grounded in deep knowledge of the particular business.

2 – The Question Chasm

Once the data passes the reasonableness test, it is time to test how it answers fundamental questions.  In the end, analytic solutions need to answer questions that lead to actions that change the operational reality.  But asking the right questions is hard.  We have been using convenience data for years and often do not step back far enough to determine what question we should be answering.  We keep asking the same questions, getting the same answers, and not advancing performance.

Asking the right questions is probably the most challenging part of making good use of analytics.  It requires stepping back, figuring out the process’s true purpose, and determining how best to measure how well the process delivers that purpose.  All processes convert inputs to outputs, and the most valuable measures are results-to-resources ratios, very much like the miles-per-gallon measure of fuel efficiency.  Often the measures in place have not been challenged for years, and it is time to step back and rethink.  Once the measures are determined, the right questions have to be formulated to get to the heart of potential performance issues.
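A results-to-resources ratio is simple arithmetic, but making it explicit keeps the conversation honest. A minimal sketch, with hypothetical process figures (units produced versus labor hours consumed):

```python
# Results-to-resources ratio, analogous to miles per gallon.
def results_per_resource(output_units: float, resource_units: float) -> float:
    """Ratio of process results to resources consumed, e.g. units per labor hour."""
    if resource_units <= 0:
        raise ValueError("resource consumption must be positive")
    return output_units / resource_units

# Compare two periods: did efficiency actually improve?
q1 = results_per_resource(output_units=1200, resource_units=400)  # 3.0 units/hour
q2 = results_per_resource(output_units=1300, resource_units=500)  # 2.6 units/hour
print(q1, q2)  # more output in Q2, but the ratio shows efficiency fell
```

The point of the ratio: Q2 produced more units in absolute terms, yet the efficiency measure went down, a distinction a raw output count would hide.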

3 – The Interpretation Chasm

The interpretation chasm is one where the user can’t see the forest for the trees and misinterprets the information.  Most analysis seeks to group the data based on some underlying potential driver of, or detractor from, performance, for example zip codes, income levels, or diagnosed diseases.  If such a driver exists, there may be a way to use that knowledge to improve outcomes.  The issue can be that we have not found the right groupings, and thus we miss the signal altogether.  A great example is in population health.  We may subgroup the data by disease and target patients with certain diseases for disease management programs.  What we may miss are more global underlying factors, like socio-economic realities and the clustering of multiple diseases.  We may be much more successful creating combinations of socio-economic groups with multiple diseases and designing a treatment approach that is not so disease-specific; in fact, psychographic clustering and management may be much more useful.
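Regrouping the same records two different ways is often enough to expose a hidden signal. A sketch with hypothetical patient records: first the usual cut by single disease, then a combined cut by socio-economic segment and disease burden:

```python
# Group the same patients two ways and compare what each cut reveals.
# All fields and values are illustrative only.
from collections import Counter

patients = [
    {"segment": "low-income",  "diseases": ["diabetes", "hypertension"]},
    {"segment": "low-income",  "diseases": ["diabetes", "copd"]},
    {"segment": "high-income", "diseases": ["diabetes"]},
]

# Usual grouping: one bucket per disease
by_disease = Counter(d for p in patients for d in p["diseases"])

# Alternative grouping: socio-economic segment crossed with disease burden
by_segment_burden = Counter(
    (p["segment"], len(p["diseases"])) for p in patients
)

print(by_disease)         # diabetes dominates either way...
print(by_segment_burden)  # ...but multi-disease cases cluster in one segment
```

The disease-only cut says "target diabetes"; the combined cut says the multi-disease patients all sit in one socio-economic segment, which points toward a very different program design.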

4 – The Right Action Chasm

Once we understand the causes of potential performance or outcome issues, we have to formulate the correct actions.  Of course, the actions must address the identified root causes to make a difference in performance. 

In quite a few organizations, activity, being busy, and doing things have become the key measure, rather than taking the right actions based on data-driven findings.  The biggest challenges in identifying the best plan of action are usually a result of Chasms 1 to 3.  However, it is still worth stepping back and forming a hypothesis of what we expect to happen if the action is taken. 

Asking the simple question, “When we take this action, what change in outcomes do we expect?” gets many to rethink the action and see it differently.

5 – The GRD Chasm

And lastly, we have the GRD – Git’R Done – chasm.  Most of us are so overwhelmed by day-to-day activity that actually taking the actions we identify has become a real challenge.  Many discussions are had, meetings held, and agreements reached, yet the actions are not taken, and thus no outcomes are achieved.  This, of course, has nothing to do with technology, but it will affect how the value of the technology solution is perceived.  No action means no business results.  Technology is only as good as the action it drives, and thus this becomes a concern for IT groups: the technology has to provide information in such a way that the resulting actions can be executed by the organization.

With analytics platforms and solutions becoming increasingly commoditized, it may fall to technology companies and internal IT resources to close these chasms.  As long as the chasms exist, the value of the technology will be limited, and the limitation will not be due to the technology itself but to the ability to bring the knowledge to life.