Manufacturing and logistics are remarkably data-rich industries. Not only are there the basic lists of names, addresses and account details, but also inventory and billing information to be managed. All this information is vital to the day-to-day running of the business, particularly for manufacturing and logistics companies, where data can often be the only link between the physical process on the ground and the planners and decision-makers. Keeping this data clean and up to date has therefore always been something of a struggle for the sector, but now more than ever firms need to be sure that the quality of their data is second to none.
According to the results of a recent survey of IT managers within the manufacturing industry, carried out on behalf of my company, 100 per cent of those questioned said that data quality is important to the success of their organisation, with over half (56 per cent) rating it very important and nearly a quarter (23 per cent) rating it critical. Despite this emphasis on the importance of data, however, almost one in ten of these companies (9 per cent) said they have no mechanism in place to monitor the quality of their data, while 27 per cent said they rely on ad hoc audits to ensure standards are maintained.
"Dirty" data, as it is called, can enter the system of a typical manufacturing or logistics company in a number of different ways. Firstly, there is human error: a tick in the wrong box or a mis-spelled name or address. Secondly, and more seriously, there may be technical issues that cause information to be flawed: data may be accidentally deleted or become corrupted, or updates may not run correctly. Thirdly, and a key cause of dirty data, there is deterioration. Even if the data is correct at the start, customer data typically degenerates at a rate of two per cent a month, or around 25 per cent over a year. That is a very fast turnover, and unless information is updated and cross-checked on an ongoing basis it becomes out of date exceptionally quickly.
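To see how quickly that monthly decay rate adds up, here is a short illustrative sketch. Only the two-per-cent-a-month figure comes from the text above; the function name and the compounding assumption are mine.

```python
# Rough illustration of how a 2% monthly decay rate erodes a contact
# database over a year. The 2%/month figure is the one quoted above;
# the compounding model and everything else here are illustrative.

MONTHLY_DECAY = 0.02  # fraction of records going stale each month

def still_accurate(months: int, monthly_decay: float = MONTHLY_DECAY) -> float:
    """Fraction of records still accurate after `months` of decay."""
    return (1 - monthly_decay) ** months

for m in (3, 6, 12):
    print(f"after {m:2d} months: {still_accurate(m):.1%} still accurate")
```

Compounded, a two per cent monthly rate leaves about 78.5 per cent of records accurate after twelve months, which is roughly the quarter-in-a-year degradation cited above.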
Our survey showed that 45 per cent of the manufacturers surveyed do not use any data quality software to protect their data asset. This is despite a clear belief, held by 64 per cent, that improved data quality would give them a greater competitive edge. Compliance is another issue, with nearly all the manufacturers asked needing their data to comply with the Data Protection Act, Sarbanes-Oxley or Basel II, among other items of legislation. These all, in some form, require greater transparency of financial and other reporting, including accurate record-keeping and auditing of the information held. If data quality is poor, companies leave themselves wide open to legal action, and their senior management, potentially, to prison.
There are other reasons, aside from compliance, why the manufacturers surveyed rely so heavily on good quality data. Some 64 per cent said that poor quality data has had a negative impact on the success of major IT implementations their company has put in place. In contrast, the survey also showed that when the manufacturing industry gets it right, the results can be impressive: 46 per cent said that implementing a data quality system has had a positive impact on the success of a major IT implementation.
All these different factors demonstrate clearly that if the manufacturing and logistics sectors weren't concerned about their data quality before, they certainly should be now. The overall business impact of dirty data, even in its least sinister form, is that customer service levels plummet, churn rates increase, acquisitions drop, and revenues and reputation suffer as a result. Expensive CRM and call centre solutions become worthless when the data they rely on is often just plain wrong. In an extremely competitive marketplace these are not consequences that any company can afford to face. So how can data be turned from a liability into an asset?
There are four key steps to ensuring long-term data quality, based around a continual, integrated process of proactive management. Firstly, it is imperative to know exactly how accurate the data is to begin with: a systematic audit should be carried out to establish exactly where any discrepancies, omissions or duplications lie. The second step is to clean the data, removing errors and consolidating information so that it is ready for use. New technology can carry out this process in days for most databases.
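The audit and cleaning steps above can be sketched in miniature. This is not Datanomic's method, just a minimal illustration of duplicate detection and consolidation; the field names and matching rules are invented for the example.

```python
# Minimal sketch of the audit-and-clean steps: find duplicate customer
# records by normalising names and postcodes, then consolidate them.
# Field names and matching rules are illustrative only.

def normalise(record: dict) -> tuple:
    """Build a crude match key: lower-cased name, whitespace-free postcode."""
    name = " ".join(record["name"].lower().split())
    postcode = record["postcode"].replace(" ", "").upper()
    return (name, postcode)

def audit_and_clean(records: list) -> tuple:
    """Return deduplicated records plus the number of duplicates found."""
    seen = {}
    duplicates = 0
    for rec in records:
        key = normalise(rec)
        if key in seen:
            duplicates += 1
            # Consolidate: keep any fields the first record was missing.
            for field, value in rec.items():
                seen[key].setdefault(field, value)
        else:
            seen[key] = dict(rec)
    return list(seen.values()), duplicates

records = [
    {"name": "Jane Smith", "postcode": "CB1 2AB"},
    {"name": "jane  smith", "postcode": "cb12ab", "phone": "01223 000000"},
    {"name": "Bob Jones", "postcode": "M1 1AA"},
]
clean, dupes = audit_and_clean(records)
print(f"{dupes} duplicate(s) found, {len(clean)} unique records")
```

Real matching engines use far more sophisticated fuzzy-matching rules, but the shape of the exercise, key the records, flag the collisions, merge what survives, is the same.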
The third, and often most challenging, step is to keep that data clean. This entails removing the sources of human or technical error as far as possible, applying strict processes and checks to all data management exercises so that the likelihood of incorrect or incompatible data entering the system is kept to a minimum. Finally, the fourth step is to ensure that the systems, and the data they contain, remain compliant and fit for the future. Data quality management must become an integral part of day-to-day business processes, and establishing key compliance criteria for all staff and systems dealing with customer information makes it far easier to maintain.
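One common way to implement the "keep it clean" step is to validate records at the point of entry, so bad data never reaches the database. A hedged sketch, with invented field names and a deliberately simple set of rules:

```python
# Sketch of validation at the point of entry: reject or flag records
# before they reach the database. The rules here are illustrative.
import re

# Simplified UK postcode shape, e.g. "CB1 2AB" or "SW1A 1AA".
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$", re.I)

def validation_errors(record: dict) -> list:
    """Return a list of problems; an empty list means the record passes."""
    errors = []
    if not record.get("name", "").strip():
        errors.append("name is missing")
    if not UK_POSTCODE.match(record.get("postcode", "")):
        errors.append("postcode is not in a recognised format")
    return errors

print(validation_errors({"name": "Jane Smith", "postcode": "CB1 2AB"}))  # []
print(validation_errors({"name": "", "postcode": "12345"}))
```

Checks like these are cheap to run on every insert or update, which is what turns data quality from a periodic clean-up exercise into an ongoing process.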
From the very first step, the long-term benefits of proactive data quality management are tangible and immediate. From a customer service perspective, it ensures that the business is presented as intelligent and capable, that the reputation of the business is protected and often improved, and that the business's competitive advantage is raised. It also often leads to significant reductions in wastage, costs and staff time; greater ability to make better business decisions based on accurate data; easier and better compliance with industry standards; and an increased ability to focus resources on proactively building the business rather than firefighting.
Good data is fundamental to good business practice, which in turn leads to good revenues and increased competitive edge. Logistics and manufacturing companies, with the masses of information they hold, should be at the forefront of putting this into practice. Data doesn't have to hold a business back; with the right processes in place, it can easily be turned into profit.
Laurie Mascott is CEO at Datanomic, a leading specialist in end-to-end data quality management and information assurance. The company delivers an innovative data quality software system that can profile, audit, clean or match data of any type, from any source, with business-application-specific real-time error prevention and data compliance solutions.