Our research consistently finds that defects in data are the root cause of a wide range of problems encountered by modern corporations. The magnitude of the problem correlates with the size of the company: Big companies have bigger headaches than midsize ones. Data issues diminish productivity in every part of a business as people struggle to correct errors or find workarounds. Issues with data are a man-made phenomenon, yet companies seem to treat bad data as some sort of force of nature like a tornado or earthquake – something that’s beyond their control to fix. At best they look for one-off workarounds and Band-Aids without tackling the root causes or recognizing the need to keep data issues in check. Data stewardship can and should be a part of a disciplined approach to management in the same way organizations implement quality control, cash management and legal compliance.
I recently gave a presentation at an analytics conference about using advanced analytics in the finance function. Advanced analytics includes techniques such as predictive analytics (a topic covered in depth by my colleague Tony Cosentino), simulation and optimization, which companies can use to make better-informed decisions, increase their awareness of positive or negative developments in their business and become more agile in responding to opportunities or threats. But advanced analytics – indeed any analytics – require that accurate data in the proper form and format be readily available.
Unfortunately, in almost all companies, data quality does not get the attention it needs; our information management benchmark research found that only 13 percent have completed data quality initiatives. (The exceptions include companies for which clean, accurate data is critical to the business model, such as web-based ones.) In most, no one is responsible for data, and there are no serious ongoing programs for data stewardship and data governance. There is no executive-level focus on relentlessly working to keep data clean the way there might be on, say, keeping production defect-free. Like the weather, people complain about data, but no one really does anything about it.
In every organization, management expects the data people use to be accurate and accessible, but we find that remarkably little is done to ensure this. Companies must address two broad issues before they can improve their data environment. Most obviously, they must have the right processes and tools in place to eliminate the root causes of bad data and deal with data quality remediation efficiently. The other issue is people-oriented. Who’s responsible for good data? In most organizations the answer is “nobody,” or, if it is somebody, that person has neither the authority nor the tools to do much about data quality. Corporate leadership must keep attention focused on the value of data quality to the smooth operation of the business. The organization has to accept the idea that the issue is important and that a solution to the problem exists. Even as I write this, though, I wonder to what extent this sort of change is possible. They don’t teach data quality in MBA programs.
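Processes and tools for remediation can start small: a routine, automated audit that flags missing fields, invalid formats and duplicate records before data reaches downstream systems. A minimal sketch in Python illustrates the idea; the records, field names and validation rules here are hypothetical, chosen only for illustration:

```python
import re

# Hypothetical customer records; in practice these would come from an
# operational system or data warehouse.
RECORDS = [
    {"id": 1, "name": "Acme Corp", "email": "ap@acme.example"},
    {"id": 2, "name": "", "email": "billing@globex.example"},
    {"id": 3, "name": "Initech", "email": "not-an-email"},
    {"id": 1, "name": "Acme Corp", "email": "ap@acme.example"},  # duplicate id
]

# Very simple email shape check: something@something.something
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def audit(records):
    """Return a list of (record id, issue) pairs found in the data."""
    issues = []
    seen_ids = set()
    for rec in records:
        if rec["id"] in seen_ids:
            issues.append((rec["id"], "duplicate id"))
        seen_ids.add(rec["id"])
        if not rec["name"].strip():
            issues.append((rec["id"], "missing name"))
        if not EMAIL_RE.match(rec["email"]):
            issues.append((rec["id"], "invalid email"))
    return issues

if __name__ == "__main__":
    for rec_id, issue in audit(RECORDS):
        print(f"record {rec_id}: {issue}")
```

Running such a check on a schedule, and tracking the issue counts over time, turns data quality from an anecdote into a metric that an accountable owner can be asked to improve.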
The trend toward big data puts a spotlight on data quality management. Without quality data, as I’ve noted, organizations will wind up with just a bigger, less manageable version of the garbage-in-garbage-out syndrome. Advanced analytics have the potential to give companies a sustainable competitive advantage, but this can’t happen if the analytics are built on data that is inaccurate, requires excessive amounts of grooming or is not timely enough to be commercially valuable. While automation plays a part in data management, attitudes and management are equally critical to improving the quality and timeliness of data.
Robert Kugel – SVP Research