American Shipper

Bad data → wrong decision

Good business decisions begin with good data, and there is no substitute for it.


      It never ceases to amaze me that companies make bad business decisions based on inaccurate or incomplete data. It's even more surprising to find that companies know their data has holes or inaccuracies, yet have done nothing to fix them. Or, if they have, it's a one-time fix made on the assumption that nothing will ever go wrong with the data again, so no sustainable process is ever put in place to keep the data clean and complete.

      On many engagements we also find that the company is investing millions of dollars in new technology, processes, and the skills and competencies of its functional teams, but doing nothing about the data. Companies must learn that good decisions, efficient processes and practices, improved visibility, and business collaboration all begin with good data, and enough of it.

Case in point

      Some of our clients have hired us to create decision models and tools to improve the quality of their decisions and processes. We work to make the best use of the available data, but frequently our models do not deliver their full potential because of low-quality data.

      On the other hand, we see people consistently making important decisions based on unrealistic information. Imagine an entire company living each month in a fantasy world while trying to figure out how to deal with daily operational challenges. We are talking about companies that have implemented million-dollar enterprise resource planning systems and spent a great deal of time and effort on process adjustments, training and implementation, yet most of the time do not even acknowledge that they have a data quality problem.

      But where does this issue stem from?

      Not knowing where to look. Many system implementations leave behind users who know how to function in their daily jobs but may not be sufficiently familiar with the system's parameters. When not set correctly, these parameters can create 'noise' in the data. For example, a wrong value in a field like Minimum Order Quantity (MOQ) may result in consistent over- or under-ordering, leading to poor service or excess stock.
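As a minimal sketch of what a parameter audit could look like, the check below flags MOQ values that are missing or implausibly large relative to demand. The field names and the 26-week threshold are hypothetical assumptions, not from any particular ERP system:

```python
# Hypothetical records: each SKU with its MOQ parameter and average demand.
records = [
    {"sku": "A100", "moq": 50, "avg_weekly_demand": 40},
    {"sku": "B200", "moq": 5000, "avg_weekly_demand": 30},  # likely entry error
    {"sku": "C300", "moq": 0, "avg_weekly_demand": 25},     # missing parameter
]

def suspicious_moq(rec, max_weeks_of_cover=26):
    """Flag an MOQ that is zero/missing, or so large that a single
    order would cover more than `max_weeks_of_cover` weeks of demand."""
    if rec["moq"] <= 0:
        return True
    return rec["moq"] > rec["avg_weekly_demand"] * max_weeks_of_cover

flagged = [r["sku"] for r in records if suspicious_moq(r)]
print(flagged)  # ['B200', 'C300']
```

A periodic sweep like this does not fix the parameters, but it turns a silent data problem into a visible work queue for whoever owns the field.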

      Insufficient training on data management. The person extracting the data could be doing a sloppy job. Handling data manually is delicate work, but many people do not realize that. A small mistake in a database query or in a spreadsheet lookup can create huge anomalies in the analysis.

      Lack of ownership and process. Many companies have entire departments or functions dedicated to Master Data Management (MDM), and for very good reason. Imagine a situation where a new stock keeping unit (SKU) is set up with incorrect or incomplete information. Some of that information could be vital, such as the item number, weight, volume or formulation. Without these critical data items being entered correctly, the entire supply chain could be running off of bad numbers, which can create havoc. When no clear ownership exists for each field of the master data, there is chaos. There must also be a sustainable process in place for the creation, entry, maintenance, audit and deletion of data from the systems.

Solutions and approaches

      Address this problem with a three-pronged approach:

1) Perform a system-wide audit of information to understand the root causes of data errors.

      • Encourage your company's whiners, get them to show you where all the errors are, and have them help you track down the root causes.

      • Make a complete mapping of where the data comes from and goes to, and pictorially figure out where you need to focus your attention.

      • The diagram will also help staff envision the volume of data the company has to deal with and where it all gets used (if you know who's using the data you put out, there's a better chance you'll be careful with it).

2) Set up a formal process, ownership and metrics for ensuring accuracy and completeness of data.

      • Create processes that define which data gets entered, manipulated or audited at each step.

      • Create flows of how an item gets set up, how a formulation gets changed, how frequently order quantities get updated, how often vendor lists are generated, etc.

      • Along with the flows, mark each function that owns each part of the flow in a different color, and post the result on walls and in corridors, so that people know where to go if they encounter bad data in any part of the chain.

      • Set up metrics for data cleanliness. Again, use the whiners to track and monitor this informally, so as to ensure the 'fairness' of the metrics. Use data quality and update frequency as quantifiable, measurable steps to success.
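One simple, quantifiable cleanliness metric is field-level completeness: the share of records with a non-empty value in each master-data field. The sketch below assumes a hypothetical record layout; the field names are illustrative only:

```python
# Hypothetical master-data records with some missing values.
items = [
    {"item_number": "A100", "weight": 1.2, "volume": 0.01, "formulation": "F-1"},
    {"item_number": "B200", "weight": None, "volume": 0.02, "formulation": ""},
    {"item_number": "C300", "weight": 0.8, "volume": None, "formulation": "F-3"},
]

def completeness_by_field(records):
    """Return, for each field, the fraction of records with a
    non-empty (not None, not blank) value."""
    fields = records[0].keys()
    return {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / len(records)
        for f in fields
    }

scores = completeness_by_field(items)
print(scores)
```

Tracked weekly, a score like this makes "the data is getting better" a measurable claim rather than an opinion.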

3) Create policies for handling, manipulating and using the company information.

      • Set up master-source rules for each report and metric the company uses. There can be only one master source for 'inventory days on hand,' and that source should feed and populate all other sub-sources; there can be only one real number in the system.

      • Clarify how analyses should be performed and how each field is calculated, so that there are no competing definitions of metrics or standards.

      • When manual data manipulations are part of the process, document step by step how they should be done, to create repeatability and ensure consistency of process.

      Remember, good business decisions begin with good data, and there is no substitute for it. Spend the time and effort to ensure accurate and complete data in your business, so that people are speaking the same language, looking at the same numbers, and there is only one version of the truth.

   Deep R. Parekh is a partner with Equus Group LLC, a supply chain advisory services and management consulting firm based in New York and Sao Paulo, Brazil. He welcomes your feedback and comments at deep.parekh@equusllc.com, and can be contacted at (917) 940-7538.
