Tag Archives: Variability
Remember when artificial intelligence (AI) was a glimmer on the horizon? And then ChatGPT stormed onto the scene and people were convinced every job out there was soon going to be replaced by a bot? Now it turns out, not so much.
As awesome (and we don’t use that word lightly) as AI is, it’s only as good as the data it has to work with. At USC Consulting Group, we’re finding this is especially true when we’re using AI for predictive analytics. AI doesn’t like variation, and there can be a lot of that in manufacturing processes.
Here’s a look into this issue and how to handle it.
A short primer into AI and predictive analytics
AI is a broad term describing computer systems that perform intelligent tasks, like reasoning, learning, problem solving, and more. Less obvious is predictive analytics, which is the ability to forecast future outcomes from data using AI. You’re already familiar with it, to a certain degree. If you’ve ever had a recommendation from Netflix based on what you’ve watched in the past, that’s it. In a nutshell.
Netflix’s use of predictive analytics created a seismic shift in consumer expectations. This technology also has the potential to transform operating procedures and processes for many industries.
It’s extremely powerful when dealing with processes in which multiple predictors are influencing outcomes. It has the ability to tell us which path to take in order to achieve a desired outcome, even when process patterns and trends are changing.
It means greater precision and accuracy, more speed, and increased efficiency – the holy grails for any manufacturer.
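To make that concrete, here is a minimal sketch of what a predictive model can look like under the hood: an ordinary least squares fit that forecasts an outcome from multiple process predictors. Every column name, value, and predicted yield below is an invented placeholder, not client data or USCCG tooling.

```python
# Minimal sketch: predicting a process outcome (e.g., yield) from multiple
# predictors using ordinary least squares. All values are hypothetical.
import numpy as np

# Hypothetical historical records: [temperature_C, line_speed_mpm, moisture_pct]
X = np.array([
    [180, 12.0, 6.1],
    [175, 11.5, 6.4],
    [185, 12.5, 5.9],
    [178, 12.2, 6.0],
    [182, 11.8, 6.3],
])
y = np.array([94.1, 92.8, 95.0, 93.9, 94.3])  # observed yield (%)

# Fit a linear model y ~ X_aug @ beta (intercept added as a constant column).
X_aug = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X_aug, y, rcond=None)

# Forecast the yield for a proposed set of operating conditions.
proposed = np.array([1.0, 181, 12.1, 6.0])
print("Predicted yield: %.1f%%" % (proposed @ beta))
```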
But there is a fly in this cyber ointment.
Variation.
AI doesn’t like it and – lo and behold – that means humans are necessary in this process in order for predictive analytics to achieve its potential.
What is variation?
When we’re talking about manufacturing processes, what exactly does variation mean?
In manufacturing, variation is the difference between an actual measure of a product characteristic and its target value. Excessive variation often leads to product discard or rework.
High process variation and instability degrade efficiency, consistency and, ultimately, profits. A key manufacturing performance objective is the establishment of stable and predictable processes that limit variation – minimum variation around target values.
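As a simple illustration (with invented numbers, not client data), variation can be quantified as the spread of actual measurements around their target value:

```python
# Minimal sketch: quantifying variation as the spread of actual measurements
# around a target value. Numbers are illustrative only.
import statistics

target = 250.0                     # target fill weight (g), hypothetical
measurements = [249.2, 251.1, 250.4, 248.7, 252.3, 250.0, 249.5]

deviations = [m - target for m in measurements]
mean_dev = statistics.mean(deviations)      # bias relative to target
spread = statistics.stdev(measurements)     # unit-to-unit variation

print(f"Average deviation from target: {mean_dev:+.2f} g")
print(f"Standard deviation (variation): {spread:.2f} g")
```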
A main focus for USC Consulting Group is to identify the root causes of variation and address them. Generally, it boils down to people, components and materials.
Some examples of causes of variation include:
- Poor product design
- Poorly designed processes
- Unfit operations
- Unsuitable machines/equipment
- Untrained operators
- Variability from incoming vendor material
- Lack of adequate supervision skills
- Changing or inadequate environmental conditions
- Inadequate maintenance of equipment
It can be one of these factors, several, or something else. But whatever it is, it’s impeding our ability – and the bot’s – to predict outcomes.
Minimizing variation with our Customized Quality System (CQS)
Every situation is different. The cause of variation on one manufacturing line isn’t going to be the same on another. USCCG assesses and evaluates client processes, then applies a customized set of tools, techniques and methods best suited to the causes of variability it finds, which allows it to address that variability efficiently. We call it our Customized Quality System (CQS).
We review processes from “the cradle to the grave” and identify the highest-impact operations, then drill down to the tasks and steps within those operations until we uncover the culprits.
Although every situation is different, the general roadmap includes:
- Carefully defining the problem
- Selecting the right team
- Objectively identifying high-impact operations
- Drilling down into the tasks within those operations
- Brainstorming possible causes on those high-impact tasks
- Recommending and implementing deeply focused corrective actions
- Implementing controls so the problem doesn’t happen again
Removing variability through our CQS not only has an immediate impact on improved product conformance but also paves the way for AI to do its job in predictive analytics, i.e., we want predictions with minimum variability.
It’s just one way USC Consulting Group is using the human touch to make sure AI is up to the job.
Read more about this in our free eBook, “AI and Machine Learning: Predicting the Future Through Analytics.”
Statistical process control (SPC) is a widely used, machine learning-based software approach in manufacturing that measures the consistency of a product’s performance against its design specifications. Minimizing variability is a crucial part of avoiding defects and maintaining resilient manufacturing operations.
This guide outlines the different ways that businesses can effectively utilize SPC and reap all of the benefits this technology has to offer.
How Statistical Process Control Works
SPC is a tried and true technology that businesses have been using for more than 100 years to improve their manufacturing operations. It conducts ongoing statistical analyses, taking into account factors such as the materials, design, employees who handled the product and the machinery used to create the product.
SPC’s constant vigilance enables businesses to make swift and accurate resolutions to quality control problems. However, it’s not fully autonomous like other manufacturing software that can identify statistical correlations without human help. Instead, it relies on large training datasets that another source must manually input to achieve the desired results.
This form of machine learning is known as supervised learning. Businesses can input human-labeled datasets by themselves, or they can recruit another algorithm to automatically input statistics in a process called “machine annotation.” In either case, SPC needs to absorb as much raw data as possible to maximize its efficiency.
SPC displays its findings in easy-to-read control charts, and it’s the business’s responsibility to set the parameters for each chart by providing the software with enough information. This process includes six basic steps:
- Define the manufacturing process you want to monitor and control by establishing the input variables, output variables, equipment, materials and any other external factors that might affect the process.
- Collect the data that the software extrapolated from the variables you provided, then organize it into a digestible format — usually a chart or spreadsheet.
- Select and construct the control charts based on the type of data you’re using, such as weight, length, temperature and any defects that might have occurred.
- Look for patterns in the control charts that indicate special cause variations in performance due to underlying defects. You can calculate process variability through a capability index, such as Cp, Cpk, Pp and Ppk (see the sketch after this list).
- Investigate the root causes of the variations and make the necessary equipment, material or operational adjustments to correct them.
- Continue to collect and organize data to identify more variations, updating the control specifications as needed.
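As a rough illustration of the collect, chart and analyze steps above, the sketch below computes ±3-sigma control limits and a Cpk estimate for a hypothetical measured characteristic. The data and specification limits are invented, and the sample standard deviation is used as a simplifying stand-in for the moving-range estimate a formal individuals chart would use.

```python
# Minimal sketch: 3-sigma control limits, out-of-control points, and a Cpk
# estimate for a single measured characteristic. All inputs are hypothetical.
import statistics

samples = [10.02, 9.98, 10.05, 9.97, 10.01, 10.20, 9.99, 10.03]  # e.g., length (mm)
lsl, usl = 9.85, 10.15   # lower/upper specification limits (assumed)

mean = statistics.mean(samples)
sigma = statistics.stdev(samples)

ucl = mean + 3 * sigma   # upper control limit
lcl = mean - 3 * sigma   # lower control limit
out_of_control = [x for x in samples if x > ucl or x < lcl]

# Capability index: distance from the mean to the nearest spec limit,
# in units of 3 sigma. Cpk >= 1.33 is a common rule of thumb.
cpk = min(usl - mean, mean - lsl) / (3 * sigma)

print(f"Mean={mean:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}  Cpk={cpk:.2f}")
print("Out-of-control points:", out_of_control)
```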
This process sounds awfully similar to Statistical Quality Control (SQC), but there are some key differences. Statistical Process Control measures independent variables, while SQC strictly focuses on dependent process outputs. SQC also carries out acceptance tests by screening individual product samples, while SPC relies on large datasets and doesn’t have an acceptance testing feature.
Types of SPC Tools
Many types of analysis tools have developed during SPC’s century-long evolution. These tools are split into two main categories — basic tools of quality (7-QC tools) and supplemental tools (7-SUPP tools). Here’s a quick rundown of how businesses can use the 7-QC tools:
- Stratification: Separating data into subcategories by unique characteristics to clarify the origins of an existing problem.
- Histogram: A bar graph that displays the frequency of variability and the most common offenders.
- Check sheet: A document in tabular or metric format that tracks the number of special cause variations.
- Cause-and-effect diagram: A chart that shows all of the factors that lead to special cause variations and draws potential correlations between them.
- Scatter diagram: A dotted diagram that displays the overlap between dependent variables on the y-axis and independent variables on the x-axis.
- Control chart: A line-based graph that shows processes’ stability levels and pinpoints the likely variation within produced items.
- Pareto chart: This chart applies the 80/20 principle — 20% of variables cause 80% of problems — to display the most common causes of manufacturing failures.
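To make the 80/20 principle behind the Pareto chart concrete, here is a minimal sketch that ranks invented defect counts and flags the “vital few” causes accounting for roughly 80% of failures; the defect categories and counts are placeholders.

```python
# Minimal sketch of a Pareto analysis: rank defect causes by frequency and
# identify the "vital few" behind ~80% of failures. Counts are invented.
from collections import Counter

defects = Counter({
    "mislabeled packaging": 46,
    "underweight fill": 31,
    "seal failure": 12,
    "scratched surface": 7,
    "wrong color": 4,
})

total = sum(defects.values())
cumulative = 0.0
for cause, count in defects.most_common():
    cumulative += count / total
    print(f"{cause:22s} {count:3d}  cumulative {cumulative:5.1%}")
    if cumulative >= 0.80:
        break  # the causes printed so far are the Pareto "vital few"
```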
Stratification also often appears in the 7-SUPP tools category because of its versatility and importance to statistical analysis. Breaking up large datasets into smaller digestible chunks makes SPC software more accurate at identifying problems and reducing variability. Here are the other six 7-SUPP tools:
- Flowchart: A straightforward diagram that outlines the step-by-step process of a manufacturing sequence.
- Defect mapping: A chart that shows the different types of known product flaws within a business’s manufacturing operations.
- Event logs: A variable summary showing the chain of events that resulted from an undesired occurrence.
- Progress centers: Centralized locations dedicated to tracking improvements and supporting informed decision making.
- Randomization: The deployment of random manual and automated input variables to eliminate human bias.
- Sample size determination: Choosing the number of subjects to include in a representative group when tracking manufacturing trends.
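As a quick illustration of the sample size determination tool above, the sketch below applies the standard formula n = (z·σ / E)² for estimating a process mean within a chosen margin of error; the standard deviation, margin and confidence level are assumed values.

```python
# Minimal sketch of sample size determination: how many units to measure to
# estimate a process mean within a chosen margin of error. Inputs are assumed.
import math

sigma = 0.8        # assumed process standard deviation (e.g., grams)
margin = 0.25      # acceptable margin of error on the estimated mean
z = 1.96           # z-score for ~95% confidence

n = math.ceil((z * sigma / margin) ** 2)
print(f"Measure at least {n} units per sampling run")
```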
Today’s SPC software modules include all of these tools, allowing businesses to access dashboards that display the various charts and diagrams in one place. These insights can lead to identification of quantifiable improvement opportunities that maximize operational efficiency.
Benefits of Using SPC
SPC is one of the most effective machine learning resources for achieving consistent performance in manufacturing operations. Eliminating process errors helps businesses address the three biggest challenges in material handling – workplace hazards, equipment damage and carbon emissions – and delivers benefits in many ways:
- Reduces manufacturing costs
- Monitors employee productivity
- Improves resource utilization
- Optimizes manual inspections
- Reduces rework and warranty claims
When these benefits combine, the final result is a more satisfied client base and a more profitable business. While SPC software can’t do all of the inspection work on its own, the tools and insights it provides are invaluable in a manufacturing environment.
Use Statistical Process Control to Its Full Potential
Business leaders who are willing to put in the necessary effort to provide SPC software with large datasets can use this technology to its full potential. They will gain access to numerous eye-opening statistics about operational inefficiencies and have all the knowledge they need to make accurate adjustments.
*This article is written by Jack Shaw. Jack is a seasoned automotive industry writer with over six years of experience. As the senior writer for Modded, he combines his passion for vehicles, manufacturing and technology with his expertise to deliver engaging content that resonates with enthusiasts worldwide.
Recently, I reached out to one of our most charismatic team members, Charlie Payne, and got him to tell me a little bit about his approach to new projects and, more specifically, his work in the food and beverage industry. Charlie is a Senior Operations Manager at USCCG, and has played an integral role in the development of our strategies and training programs since he came on board in 1990. He has years of experience in various industries including Food & Beverage, Mining, Oil & Gas, Life Sciences, and Manufacturing. Charlie’s innovative solutions and ability to build strong client relationships are the foundation for his long record of successfully completed projects. Here’s what he had to say:
When I start a new project, I certainly like to get face to face as soon as possible so we can get a sense of each other, understand the issues, and decide if we might be a fit. Usually, a half-day on site is enough to see if we want to do a more detailed two-week analysis where we’ll come in with a team and put together a business case and answer three questions:
- Is USCCG the right group to help drive results with this client – do we hit their team right?
- Are the issues we see addressable by USCCG in a timeline that makes sense? Is their management team open to change?
- Is there a viable business case? Usually we strive for a 2-to-4 times return on the project costs, depending on the scope of the project. Generally, a bigger scope means a bigger return.
If the business case makes sense, then I like to go into project mode quickly – the consistency of the team we use is important to us, and it’s a lot easier than an extended decision process that forces me to acquaint a new team with the client.
As a Senior Operations Manager, I am responsible for delivering the results of the project and managing the partnership we have with our client. Working with the bench strength we have in our consultants – full time USCCG employees, many of whom I have worked with for years – can often make the projects successful and fun. To have fun with the client and my team is a key success factor for me.
I have worked with all types of clients, and I enjoy working in the food & beverage industry because it’s unlike other business sectors due to the variability. In food processing, the input can often be so variable – size, quality, quantity – it can make us wonder why we try to manage the process at all, when we are out of control right from the start! If you can’t affect the input, then in my opinion, it makes it even more important to control what you can – usually the process within the plant walls.
We’re certainly not making widgets and it’s not as precise as manufacturing. When dealing with nature we can be affected by harvest size, variations in bird or hog size, droughts, so you never know what you’re going to get. Combine that with a work force made up of a multitude of cultures, languages, and literacy skills and you have challenges getting consistency into the process. After 24 years and many businesses, the hardest position I have ever seen is to be a supervisor in further processing in a cold plant.
My work in the food & beverage industry has certainly been across the spectrum from large privately held to co-op, slaughter facilities to further processing. Lately, I’ve seen an increase in organic clients, as well as those expanding heavily overseas and wanting to right size their facilities.
The fierce competition and amalgamation from emerging markets are part of why our food and beverage clients look to us for opportunities to improve their processes to stay ahead of their competitors.
When we engage clients in this industry, we work across a number of fronts – in production plants we focus on yield, throughput, and productivity (OEE-type metrics) and on installing a Management Operating System (MOS) in the plant to bring consistency between shifts, lines, and plants.
Bringing in our LINCS Business Intelligence solution makes the key metrics known in a timely manner and helps focus management on the items they need to address to get the performance they need. More recently, we’ve worked on reducing costs with a focus on the purchasing department and the spend on packaging materials, bringing great benefits to our food & beverage clients.
I believe good management practices are applicable across all industries and our solutions for our food & beverage clients can and have proven to be successful for various other clients.