Blog

Data quality and Jerry Maguire


by Victoria Thomason

10 Apr 2014

Show me the money – the best argument for putting forward a data quality business case

This is a great article by Andy Hayler showing that building a business case for data quality is not rocket science: it simply involves taking a look at existing business processes and counting the cost of the errors that poor data quality is causing in them today.

The Information Difference regularly conducts surveys on data management. Data quality is a frequent topic in these surveys, which the company has been carrying out for eight years now. A consistent theme over this time is that the number one barrier to better data quality is “management do not see this as an imperative”, followed closely by “it is difficult to present a business case”.

Since senior management typically only get engaged in things that either increase sales, reduce costs or keep them out of jail, these two barriers would appear to be intimately linked. With this lack of engagement from management, it is perhaps not surprising that those same surveys show that as many as 30% of companies do not use data quality tools at all, and that even among those who do, the penetration of such tools is weak, with only a third of companies having data quality initiatives spanning the entire enterprise.

Although data quality may not be the sexiest topic, it should not in fact be that difficult to construct a business case. Start with a business process like “sell to customer” or “develop new product”. Usually processes like these are broken down into smaller steps, such as “bill to customer”. In a 2011 study, the Institute of Management and Administration reckoned that, across all industries, 2% of invoices contain errors, that a typical finance clerk deals with around 2,500 invoices per month, and that a typical invoice costs a little over $10 to process. A company with 5,000 employees will typically employ a dozen accounts payable staff, so invoice errors are costing such a company around $72,000 a year (for a really large company the figures will clearly be greater). A data quality initiative that halved error rates in such a case would save $36,000 a year. Using some reasonable assumptions for a discount rate (18%) and on-going costs of 10%, a data quality initiative costing $70,000 would have a positive net present value over three years and an attractive IRR.
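As a quick sanity check, here is a short Python sketch of that arithmetic. All the figures are the ones quoted above; the one interpretation added here is reading the “on-going costs of 10%” as 10% of the annual savings, which is the reading under which the three-year NPV comes out (marginally) positive at an 18% discount rate.

```python
# Back-of-envelope check of the invoice-error arithmetic above.
# Figures are taken from the article; the one interpretation added
# here is treating "on-going costs of 10%" as 10% of annual savings.

CLERKS = 12                 # accounts payable staff at a 5,000-person firm
INVOICES_PER_MONTH = 2_500  # invoices handled per clerk
ERROR_RATE = 0.02           # 2% of invoices contain errors (IOMA, 2011)
COST_PER_INVOICE = 10.0     # "a little over $10", rounded here

annual_error_cost = (CLERKS * INVOICES_PER_MONTH * 12
                     * ERROR_RATE * COST_PER_INVOICE)
print(f"Annual cost of invoice errors: ${annual_error_cost:,.0f}")  # $72,000

savings = annual_error_cost / 2   # the initiative halves the error rate
ongoing = 0.10 * savings          # assumed: 10% of savings, every year
net_flow = savings - ongoing      # $32,400 per year

INITIAL_COST = 70_000
RATE = 0.18                       # discount rate from the article

npv = -INITIAL_COST + sum(net_flow / (1 + RATE) ** t for t in (1, 2, 3))
print(f"Three-year NPV at 18%: ${npv:,.0f}")  # marginally positive, ~$450
```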

You can take the same approach to any business process, many of which will carry significantly higher costs than mere invoice processing, with error rates that may be much higher too. A number of authors, such as Tom Redman and Larry English, have published articles claiming that poor data quality costs a company between 8% and 25% of operating revenue, though hard data on the subject remains elusive. Even allowing for some self-interest from authors in the field, who may benefit from alarming figures through increased consulting work, the potential for generating a business case is clearly considerable.

Suppose that the true figures for data quality costs were much lower, at 2% of revenues, in line with the well-researched invoice error rates mentioned earlier. If you could halve the errors associated with poor data quality through well-targeted application of technology and processes, you would save costs equivalent to 1% of revenue. That may not sound like a lot, but for companies of serious size it is huge. For example, the same company with 5,000 employees would, if it were a typical medical device company (revenues per employee vary by industry), have revenues of $1.25 billion, so a 1% saving means $12.5 million of annual benefit. In such a case a data quality initiative achieving a 50% improvement could cost $25 million (plus, say, 10% on-going costs of implementation once finished) and still have a positive net present value and healthy IRR using some reasonable assumptions. $25 million buys you an awful lot of data quality software and consultants to implement it.
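The same sketch scales to this enterprise case. One hedge is needed: the article does not state an evaluation horizon, and at these figures a three-year horizon would not quite break even at 18%, so the snippet below assumes five years.

```python
# Enterprise-scale version of the same calculation. The five-year
# horizon is an assumption (the article does not state one); over
# three years this cash flow would not break even at 18%.

revenue = 1.25e9               # 5,000 employees x ~$250k revenue each
savings = 0.01 * revenue       # halving an error cost of 2% of revenue
INITIAL_COST = 25e6
ongoing = 0.10 * INITIAL_COST  # "10% on-going costs of implementation"
net_flow = savings - ongoing   # $10M per year
RATE = 0.18

npv = -INITIAL_COST + sum(net_flow / (1 + RATE) ** t for t in range(1, 6))
print(f"Five-year NPV at 18%: ${npv / 1e6:.1f}M")  # roughly +$6.3M
```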

Dodgy data quality can have all kinds of costs beyond the basic one of wasted staff time. In 2012 the UK insurer Prudential was fined £50,000 for a basic data quality error: two customer records belonging to customers with the same name and date of birth were merged, resulting in funds owed to one customer being sent to the other. That fine stemmed from a single error involving just two of the company’s six million customers.

Hopefully these examples show that building a business case for data quality really is not rocket science: take a simple look at your existing business processes and count the cost of the errors that poor data quality is causing in them today. Sometimes the sums of money will be small, but as we have seen, the monetary value can be considerable, especially in larger companies. If you prepare a solid business case using this sort of technique, you should have no difficulty engaging senior management on why they should be investing in data quality. As Tom Cruise’s character memorably put it in the 1996 movie Jerry Maguire: “show me the money”.

You can read Andy Hayler’s original article here: http://www.cio.co.uk/blogs/data-management/data-quality-jerry-maguire/