Improving product quality with data analytics

Defining data flow encourages efficiency throughout metal stamping shops

Manufacturing companies need to document the flow of data in their organizations.

Documenting the flow of data through your organization will help you understand the source of the data and its transformation, allowing you to make better decisions from its analysis.

Each activity on your stamping floor is designed to turn a flat piece of metal into a useful component. Each part of your process moves, modifies, or stores your raw material, work-in-process, or your final component. A quality component meets its specifications, and reliable materials are needed to form quality components. Inputs to and outputs from processes are defined to measure and control each process. The more you understand and manage each process and component, the more efficiently your plant operates. The same is true with the data you manage during production.

One of your most important manufacturing tools is the flow of data through your organization. If you think of your plant as a living organism, your equipment and people are like body organs that convert energy and raw material for a result. Electricity and gas are the blood that fuels each activity. Data provides the nervous system that directs and controls that activity. The efficiency of your production operation and value of your administrative activities can be directly influenced by the quality of your information and how well it is shared and understood.

Small Activities, Big Impact

The information you collect and process talks to you, whether you use artificial intelligence or your own analysis. In today’s stamping plants, seemingly small activities can have a significant impact on both your financial and production results. A string of frequent failures traced back to a specific coil or lift, combined with its mill certification, allows you and your suppliers to understand how your processes respond to variability in metal properties. A scheduled cleaning or changing of a lubrication nozzle can improve forming results and reduce the cost of consumables. A tonnage monitor can reveal an imbalance or a fatigue failure in a press component.

These and many similar activities can be measured and their results discovered if you allow them to leave digital footprints. Those footprints are the data that supports continuous learning and improvement.

Big Data

Much of the manufacturing technology talk today includes Big Data and data analytics. Big Data literally means nothing more than lots of data. Data analytics offer promising tools to gain insight into your equipment, process, and production relationships. Analytics, however, are only valuable if founded on timely, reliable data with an understood purpose.

Big Data is generally noisy, and its reliability is uncertain. The technology to deal with this uncertainty is still evolving, and the results of analysis are difficult to interpret. Rather than focus on data quantity, manufacturers should focus on collecting smart data from the beginning.

You need storage to back up your data; unfortunately, it’s costly to maintain the integrity and availability of Big Data. Network infrastructure must be robust enough to transfer data where it is needed. Analytical software must also be robust to support analysis of large volumes of data.

Data Flow Diagramming

Data flow diagramming (DFD) is an information technology design tool dating back to the publication of Structured Systems Analysis: Tools and Techniques by Chris Gane and Trish Sarson in 1977. Subsequent DFD tools drew on the Gane and Sarson principles. The most commonly used principles were published by Tom DeMarco (Structured Analysis and System Specification) and Edward Yourdon (Modern Structured Analysis).

One of the values of defining data flow is to develop a deeper understanding of the data your organization collects, how it is processed, and what decisions are made from that data. The model encourages efficiency because it challenges the acquisition of data to ensure that each datum collected is used and has value. The model also develops our understanding of what our collected data means in the context of daily operations. Data flow analysis answers the questions:

  • What data do I collect?
  • Where does data come from?
  • How do I process acquired data?
  • Where do I send data after it’s processed?

If you look closely at your data analysis, you should also ask the questions:

  • What data do I need to support good decisions?
  • Do I use available data wisely?
  • If I need additional data, where can I acquire it?

Figure 1. Data flow diagrams comprise four main elements. An entity is an external source or destination of data. A process receives data, changes it, and creates an output. The process may perform a computation, direct or change data based on business rules, or reorganize the data. A data store holds information for later use. A data flow shows the direction and flow of data between external sources, processes, and data stores.
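The four elements described in Figure 1 map naturally onto a small data model. The sketch below is illustrative only: the class names and the stamping-floor nodes (a tonnage monitor feeding a hit-logging process and a history store) are assumptions chosen for this example, not part of any DFD standard or tool.

```python
from dataclasses import dataclass

# Hypothetical node types for the four DFD elements (names are illustrative).
@dataclass(frozen=True)
class Entity:          # external source or destination of data
    name: str

@dataclass(frozen=True)
class Process:         # receives data, transforms it, creates an output
    name: str

@dataclass(frozen=True)
class DataStore:       # holds information for later use
    name: str

@dataclass(frozen=True)
class DataFlow:        # directed movement of data between two nodes
    source: object
    target: object
    label: str

# A fragment of a stamping-floor diagram (purely illustrative):
press = Entity("Press tonnage monitor")
log_hits = Process("Log press hits")
hit_store = DataStore("Hit history")

flows = [
    DataFlow(press, log_hits, "tonnage reading"),
    DataFlow(log_hits, hit_store, "timestamped hit record"),
]

for f in flows:
    print(f"{f.source.name} --[{f.label}]--> {f.target.name}")
```

Even a toy model like this forces you to name each flow, which is exactly the discipline the diagram is meant to impose: if you cannot label a flow, you probably do not understand what that data is for.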

Documenting the flow of data through the organization is important for building applications and knowledge. When you understand the source of data and its transformation, you can make better decisions from the analysis of that data. Production employees are the subject matter experts in your plant. Analysis performed by data scientists can offer important insights but must be assessed by knowledgeable employees. Trends and events in your data may have causes beyond what appears obvious. Your staff should be the first to confirm or reject analysis before management makes important decisions. The potential cost to financial results, production, and safety is too high to accept analysis at face value.

Some developers consider DFDs to be obsolete, but they remain one of the best tools to define the flow of data and its role in your operation. They also are designed to be easily understood by both users and technology professionals. Let’s take a look at some of the fundamentals.

The DFD documents the source and flow of data through your respective systems. Figure 1 shows the four fundamental elements of a DFD. The rules for diagramming are:

  • Each process must have at least one input and one output.
  • Each data store must have at least one input and one output.
  • Data stored within a system must go through a process.
  • All process outputs must go to another process, a data store, or an external entity; data cannot flow directly between entities and data stores.
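These rules can be checked mechanically once a diagram is written down. The sketch below is a hedged illustration, assuming a diagram is represented as a dictionary of node kinds and a list of (source, target) flows; the function name, node names, and representation are invented for this example and do not come from any real DFD tool.

```python
# Minimal DFD rule checker (representation and names are assumed for illustration).
# Nodes are tagged "entity", "process", or "store"; edges are (source, target) pairs.

def check_dfd(nodes, edges):
    """Return a list of rule violations for a DFD given as
    nodes: {name: kind} and edges: [(source_name, target_name)]."""
    violations = []
    inputs = {n: 0 for n in nodes}
    outputs = {n: 0 for n in nodes}
    for src, dst in edges:
        outputs[src] += 1
        inputs[dst] += 1
        # Stored data must go through a process: every flow touches a process,
        # so nothing moves directly between entities and data stores.
        if nodes[src] != "process" and nodes[dst] != "process":
            violations.append(f"flow {src} -> {dst} bypasses a process")
    for name, kind in nodes.items():
        # Every process and every data store needs at least one input and one output.
        if kind in ("process", "store"):
            if inputs[name] == 0:
                violations.append(f"{kind} '{name}' has no input")
            if outputs[name] == 0:
                violations.append(f"{kind} '{name}' has no output")
    return violations

# An invented mill-certification flow that satisfies all the rules:
nodes = {
    "Coil supplier": "entity",
    "Record cert": "process",
    "Cert archive": "store",
    "Quality review": "process",
    "Plant manager": "entity",
}
edges = [
    ("Coil supplier", "Record cert"),
    ("Record cert", "Cert archive"),
    ("Cert archive", "Quality review"),
    ("Quality review", "Plant manager"),
]
print(check_dfd(nodes, edges))   # an empty list means no rule violations
```

A checker like this is most useful when a diagram grows past a single page: a store that is written but never read, or a process with no output, usually signals data you collect but never use.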

Analyses are developed in a series of levels. Level 0 provides a high-level overview of the process of interest and its inputs, outputs, sources, and destinations. Each subsequent level provides a more detailed presentation of the higher-level diagram (see Figure 2).
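Leveling can be illustrated with a single process expanded into numbered subprocesses. The decomposition below is a made-up stamping example; the dotted numbering is a common DFD convention, but the process names and structure are assumptions for illustration only.

```python
# Illustrative leveled decomposition: process "1" at Level 0 expands into
# subprocesses "1.1", "1.2", ... at Level 1. The processes are invented.
levels = {
    "0": {"1": "Produce stamped component"},
    "1": {
        "1.1": "Receive and verify coil",
        "1.2": "Stamp parts",
        "1.3": "Inspect and record quality data",
    },
}

# Every Level 1 subprocess refines its Level 0 parent, named by its number prefix.
for number, name in levels["1"].items():
    parent = number.split(".")[0]
    print(f"{number} ({name}) refines process {parent}")
```

The numbering keeps each detailed diagram traceable back to the overview, so a reader can move between levels without losing context.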

For each component of your DFD, document what it is, its details, and why it is employed. A quality DFD can help you understand, improve, and modify the flow of administrative, logistical, maintenance, and production data through your organization.

DFD is one of the most beneficial tools to help you understand how you can use data to manage the quality of your components and the efficiency of your operation. If you understand the nature of the data you use and collect all meaningful available data, that data can help improve your decision making and the quality of your analytics.

Figure 2. Analyses are developed in a series of levels. Level 0 provides a high-level overview of the process, while each subsequent level provides a more detailed presentation of the higher-level diagram.

About the Author
4M Partners LLC

Bill Frahm

President

P.O. Box 71191

Rochester Hills, MI 48307

248-506-5873