Summary of “Only 3% of Companies’ Data Meets Basic Quality Standards”

Most managers know, anecdotally at least, that poor quality data is troublesome.
Bad data wastes time, increases costs, weakens decision making, angers customers, and makes it more difficult to execute any sort of data strategy.
The method is widely applicable and relatively simple: we instruct managers to assemble 10-15 critical data attributes for the last 100 units of work completed by their departments – essentially 100 data records. Working record by record, they mark obvious errors and then count the records that are error-free. That count, which can range from 0 to 100, represents the percent of data created correctly – their Data Quality Score.
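The scoring step described above can be sketched in a few lines. This is a minimal illustration, not the authors' tooling: each record is assumed to be a dict of attribute values, and `is_flawed` stands in for whatever correctness check the team applies to an attribute.

```python
def data_quality_score(records, is_flawed):
    """Count error-free records; with 100 records this is the DQ Score (0-100).

    A record counts as error-free only if every critical attribute
    passes the team's own correctness check.
    """
    return sum(
        1 for record in records
        if not any(is_flawed(attr, value) for attr, value in record.items())
    )

# Illustrative example: 100 records with one critical attribute,
# where an empty value counts as a flaw. Every 10th record is flawed.
records = [{"customer_id": str(i) if i % 10 else ""} for i in range(100)]
score = data_quality_score(records, lambda attr, value: value == "")
print(score)  # 90 – i.e., 90% of records were created correctly
```

In practice the check is done by eye, not code; the point is simply that the score is a count of clean records out of 100.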
No manager can claim that his area is functioning properly in the face of data quality issues.
Still, most find a good first approximation in the “Rule of Ten,” which states that “it costs ten times as much to complete a unit of work when the data are flawed in any way as it does when they are perfect.” For instance, suppose you have 100 things to do and each costs $1 when the data are perfect, for a total of $100. If, say, 11 of those units involve flawed data, the total rises to 89 × $1 + 11 × $10 = $199 – nearly double.
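The Rule of Ten turns a Data Quality Score directly into a rough cost estimate. A small sketch, with the $1 unit cost and 10x penalty taken from the rule above (the parameter names are ours):

```python
def rule_of_ten_cost(dq_score, units=100, unit_cost=1.0, penalty=10):
    """Estimate total cost of `units` tasks given a Data Quality Score.

    `dq_score` is the number of error-free records out of `units`;
    each flawed unit costs `penalty` times as much as a clean one.
    """
    clean = dq_score
    flawed = units - dq_score
    return clean * unit_cost + flawed * unit_cost * penalty

print(rule_of_ten_cost(100))  # perfect data: 100.0
print(rule_of_ten_cost(89))   # 11 flawed units: 89 + 110 = 199.0
```

The estimate is deliberately crude – a first approximation, as the authors put it – but it makes the cost of a low score concrete.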
Bad data is a lens into bad work, and our results provide powerful evidence that most data is bad. Unless they have strong evidence to the contrary, managers must conclude that bad data is adversely affecting their work.
While some data quality issues are unfathomably complex, many yield quickly and produce outsize gains.

The original article.