
The Cost of Poor Data Quality and How it Impacts Manufacturing 4.0

Critical to the success of any data analytics initiative is a focus on data quality, with the goal of having clean data. This means the data must be accurately labeled, free of duplicate records, and blended to generate correct results. Cleaning up the business’s data quality issues is critical to becoming a truly data-driven enterprise and building a data-driven culture.

About this Article

Clean data is especially important when manufacturing processes are controlled, monitored, and measured by automation. We are in the Fourth Industrial Revolution. Data exchange technology and the rise of automated systems in manufacturing industries are spreading to factory floors in ways that parallel the growth of information technology.

The acronym that describes this trend is IIoT, the Industrial Internet of Things. Techopedia describes the IIoT as “a term for all of the various sets of hardware pieces that work together through connectivity to help enhance manufacturing and industrial processes.”

Since the emergence of the Fourth Industrial Revolution, significant advances in machine learning, artificial intelligence, and the Internet of Things have been pursued across the industrial sector. However, many manufacturers have yet to incorporate advanced analytics into their business and manufacturing processes.

This article will address the causes and costs of poor-quality data—and how to make your data strategy part of your organization’s continuous improvement program.

Data Quality Management: Quantity Doesn’t Always Translate to Quality

The rapidly increasing volume and variety of data from business applications, sensors, third-party sources, and e-commerce transactions add to the challenge. Historically, disparate data from multiple applications and data sources have created data quality issues and, ultimately, bad data. To compensate, data analysts and scientists must cleanse the data before incorporating it into their analytics dashboards and models.
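As a rough illustration (not from the article), a minimal cleansing pass in Python with pandas might look like the sketch below; the column names and values are hypothetical:

```python
import pandas as pd

# Hypothetical quality records pulled from two source systems;
# all column names and values here are illustrative only.
records = pd.DataFrame({
    "machine_id":  ["M-01", "M-01", "m-02", "M-02", None],
    "defect_code": ["D12", "D12", "D07", "D07", "D12"],
    "measured_at": ["2023-01-05", "2023-01-05", "2023-01-06",
                    "2023-01-06", "2023-01-07"],
})

# Normalize labels so "m-02" and "M-02" count as the same machine.
records["machine_id"] = records["machine_id"].str.upper()

# Parse timestamps so downstream tools can sort and filter correctly.
records["measured_at"] = pd.to_datetime(records["measured_at"])

# Drop exact duplicates and rows missing a required key.
clean = records.drop_duplicates().dropna(subset=["machine_id"])
print(clean)
```

Note how normalizing labels first lets the duplicate check catch records that only differ in formatting.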

Most industry estimates show that it costs $10 per record to clean up bad data, and $100 per record if you do nothing. Moreover, the ramifications of doing nothing will continue to grow. As a result, business users waste time dealing with bad data, data scientists spend excessive time cleaning it up, and IT must invest in developing processes to keep disparate systems clean.

And most importantly, leaders are left with bad data that drives poor decision-making, with the effects summarized in Experian’s 2021 Global Data Management Research report. According to Experian, poor-quality data has the following impacts, listed from highest to lowest:

  • Wasted resources and increased costs
  • Damage to analytics reliability
  • Harm to customer experience
  • Delays in order fulfillment
  • Negative impact on brand reputation
  • Impeding digital transformation
  • Hindering of regulatory compliance

In short, data quality management is needed. The problem is highlighted by Gartner’s finding that roughly 20% of all data in circulation is bad data.

What is Data Management?

According to Oracle, data management is “the practice of collecting, keeping, and using data securely, efficiently, and cost-effectively.” When done within the bounds of the organization’s data governance policy and controlling regulations, effective data management yields better business decisions and processes, in manufacturing and elsewhere, that benefit the organization.

How to Determine and Ensure Data Quality

TechTarget defines data quality as “a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability, and whether it is up to date.” So, what are the widely accepted metrics and measurements of data quality? The list below covers the most common ones, followed by a short code sketch showing how a few of them can be measured in practice.

Data Quality Metrics Include: 

1. Accuracy - the percentage of errors discovered in data records and how that percentage tracks with data integrity rules

2. Completeness - the number of records with missing or incomplete data

3. Consistency - how well your data is merged from different sources and combined into a single source of truth

4. Timeliness - how records are tracked, used, and archived according to their timestamps so the data set does not go stale

5. Uniqueness - how the input system copes with and tracks duplicate data

6. Validity - how the data conforms to standards of formatting so that it can be incorporated into the business rules and operations within the bounds of data governance policies 
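To make these metrics concrete, here is a minimal sketch, assuming a pandas DataFrame with hypothetical columns and rules, of how completeness, uniqueness, validity, and timeliness might be computed:

```python
import pandas as pd

# Hypothetical production records; column names are illustrative only.
df = pd.DataFrame({
    "order_id":  [1001, 1002, 1002, 1003, 1004],
    "quantity":  [50, 20, 20, None, 35],
    "ship_date": ["2023-02-01", "2023-02-03", "2023-02-03",
                  "2023-02-28", "not a date"],
})

# Completeness: share of rows with no missing required fields.
completeness = df[["order_id", "quantity"]].notna().all(axis=1).mean()

# Uniqueness: share of rows that are not duplicates of an earlier row.
uniqueness = (~df.duplicated()).mean()

# Validity: share of ship_date values that parse as real dates.
parsed = pd.to_datetime(df["ship_date"], errors="coerce")
validity = parsed.notna().mean()

# Timeliness: share of records no older than an (assumed) cutoff date.
timeliness = (parsed >= pd.Timestamp("2023-02-01")).mean()

print(f"completeness={completeness:.0%}, uniqueness={uniqueness:.0%}, "
      f"validity={validity:.0%}, timeliness={timeliness:.0%}")
```

In practice, each metric’s thresholds and rules would come from your own data governance policy rather than the hard-coded values above.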

Data Quality Assessment: A Common Use Case in Manufacturing 4.0

Say that your manufacturing company wants to create a quality dashboard as part of its continuous improvement initiatives. The first step is to consider the amount of data needed to get a complete and accurate view of the company's critical quality metrics (data points) and the data profiling requirements.

However, most companies create and collect large amounts of process data but typically use it for tracking purposes only. So, the analytics dashboard needs to include scrap data, warranty claim data, inbound inspection data, rework data, and more to get a complete and accurate picture of the “quality health” of the company.
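As an illustrative sketch only (the table and column names are hypothetical), blending such feeds into one quality view might look like this:

```python
import pandas as pd

# Hypothetical extracts from separate quality systems.
scrap = pd.DataFrame({"part": ["A", "B"], "scrapped": [12, 4]})
inspection = pd.DataFrame({"part": ["A", "B"], "inspected": [400, 250],
                           "rejected": [6, 9]})
warranty = pd.DataFrame({"part": ["A", "B"], "claims": [3, 1]})

# Blend the feeds on the shared part number to get one quality view.
quality = (scrap
           .merge(inspection, on="part")
           .merge(warranty, on="part"))

# Derive a dashboard KPI from the blended data.
quality["reject_rate"] = quality["rejected"] / quality["inspected"]
print(quality)
```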

Don’t Rely on Your ERP for High-Quality Business Intelligence Dashboards

Conventional thinking has been that businesses can glean the data for a quality business intelligence dashboard from an Enterprise Resource Planning (ERP) system. However, relying entirely on the ERP won’t help much in fusing business data with manufacturing data.

That is because much of the detailed data from the shop floor is never captured in the ERP system or stored in the enterprise data warehouse. Instead, this data is processed and stored in multiple, fragmented systems. The key is to break down those enterprise data “satellite silos” and capture the data for the analytics dashboard mentioned above.
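A minimal sketch of that capture step, assuming hypothetical ERP and shop-floor (MES) extracts, might join the fragmented sources on a shared order number:

```python
import pandas as pd

# Hypothetical ERP order header (high level) and shop-floor/MES detail.
erp_orders = pd.DataFrame({
    "order_id": [5001, 5002],
    "product":  ["pump", "valve"],
})
mes_events = pd.DataFrame({
    "order_id":   [5001, 5001, 5002],
    "station":    ["weld", "paint", "weld"],
    "rework_min": [0, 15, 8],
})

# A left join keeps every ERP order and attaches the shop-floor
# detail that the ERP system never captured.
full_picture = erp_orders.merge(mes_events, on="order_id", how="left")
print(full_picture)
```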

In turn, breaking down those silos and consolidating their many data points can result in a replicable, proactive, and consistent approach to data accuracy and a myriad of related issues. That consistency builds organizational trust in the information and leads to increased adoption across the organization.

Good Data Quality Generates Good Outcomes And Opportunities

In Manufacturing 4.0, production, quality control, and demand planning are among the many functions that manufacturers can enhance through good data quality. Having trusted and complete data improves visibility into manufacturing processes and reduces or eliminates engineering flaws, manufacturing over- and under-runs, product defects, and other quality-related problems.

The opportunity in the previous scenario lies in investing in data quality tools and the right skill set, which will allow the business to prep, blend, and cleanse its existing process data with the data in its ERP and EDW systems. As a result, analysts can quickly analyze the data, spot patterns, and draw actionable insights and the best business intelligence from the information.

How to Get Your Data Strategy Right to Ensure High-Quality Data

  • Having clean data must be at the core of every manufacturing company’s continuous improvement programs.
  • Implementing a consistent and comprehensive strategy for data quality within your organization is central to transforming your business.
  • The correct data quality management strategy will help you:
    • make better and more informed business decisions
    • maximize the success of your current and future business initiatives.

  • That transformation turns your data into trusted, actionable insights.
  • Investing in the right technology is crucial to maintaining data quality over time. 

Discover how you can take the first steps to establish a data strategy that works. Download our eBook, “The Executive’s Guide to Building a Data Strategy That Leads to Business Growth & Innovation.”
