Tag Archives: Batch Processing

In theory, the continuous improvement model encourages organizations to advance positive changes frequently and smoothly. But what about in practice? How can an industry such as oil and gas, currently grappling with so much disruptive change, reach a point where practical CI is even possible?

It all starts with culture, the fertile ground from which continuous improvement principles grow into day-to-day practices and work is carried through to completion time and time again.

The roadmap of Continuous Improvement culture-building

There are several aspects to developing a CI culture, some of which require creative, as opposed to formulaic, thinking. Think of CI culture-building as preparing for a long journey: before setting out, an organization must identify where it is going, who is coming along, and what it will need on the way.

Armed with an overview of what continuous improvement culture-building entails, let’s turn our attention back to Oil & Gas and discuss why rapidly developing an internal environment that supports these methodologies matters.

1. Oil & Gas has room for improvement

Many influential organizations have called on Oil & Gas to acclimate to a world driven by environmental and cultural sustainability. Earlier this year, the United Nations Development Program, along with the International Finance Corporation and IPIECA, published a report detailing how private-sector Oil & Gas companies can integrate 17 sustainable development goals into standard operations around the world.

Additionally, according to the latest data from the Bureau of Labor Statistics, women make up only one-fifth of all U.S. oil and gas extraction jobs, even though women occupy about 47 percent of the entire labor force. Creating environmentally sustainable operations and achieving stronger female representation in Oil & Gas both represent worthwhile justification for starting the continuous improvement cycle sooner rather than later.

All Oil & Gas workers, regardless of role, deserve CI value boosts.

2. Individual value creation is imperative in a low-price environment

At its essence, CI trains organizations to target and remove waste ad infinitum, which increases the value of the work each CI-compliant employee performs. It also incentivizes leaders to invest in training, as doing so will maximize their returns in the form of a highly intelligent workforce.

Automation in Oil & Gas behaves in a similar fashion, reducing work that doesn’t add value or actively depletes value. Both continuous improvement and automation are necessary, and can easily play off of each other, as Oil & Gas companies aim to minimize their operating expenses in the long term and adjust to a financially leaner industry climate. However, in many instances, CI is the figurative fuel that powers the engine of cost-saving innovations like automation. Advanced software and technology-driven processes will not succeed without a culture that clearly defines their significance to the organization utilizing them.

3. Forming the right Continuous Improvement team takes time

Any business undertaking CI methodologies must first build a team of core members who will strive for success. That takes a lot of careful planning, scheduling, and even permanent alterations to roles within a company.

CI team members must possess a deep understanding of their industries, market performance and the challenges of their unique businesses – all signs point to the inclusion of senior-level management, as well as perhaps a few executive stakeholders, along with a cadre of rank-and-file workers with highly developed skills and specialized knowledge.

But instead of piling continuous improvement-related duties on top of traditional job specifications, Oil & Gas companies must rewrite all internal roles to account for CI, which will also mean delegating legacy duties once reserved for upper management to new parties down the chain of command. Those are not decisions to enter into lightly, so it behooves businesses to start planning now to implement continuous improvement as soon as possible.

Continuous improvement puts the future of Oil & Gas within reach, but companies must first develop a culture conducive to best practices. From there, augmenting operations and incorporating new elements into the greater business schema will become far easier, and Oil & Gas companies can adapt intelligently to whatever tomorrow brings.

For more information on continuous improvement and operations management in oil and gas, contact a USC Consulting Group representative today.

Industries from biopharm to oil and gas are abuzz with praise for continuous processing technology and the advantages the model brings to their businesses. Compared with batch production, continuous processes typically take less time, consume less energy, and leave a smaller operational footprint, depending on the industry and assets involved. Many businesses have consequently seen significant Opex cost reductions, productivity gains, and alternative value-add opportunities.

Hype surrounding continuous processing can be particularly difficult to examine objectively, especially for decision-makers who lack the technical expertise to determine whether the batch processes under their purview warrant an upgrade to continuous status.

Do your operations fit the criteria below? Then it may be time to switch. Or perhaps, given what you learn, you will need to develop other areas first before taking the dive into continuous processing, so you can gain and sustain its benefits.

Continuous processes ‘heating up’ in biopharm and chemical processing
If the science matches up, your company could be a prime candidate for continuous processing. Researchers from the Agency for Science, Technology and Research in Singapore published a study demonstrating how exothermic and endothermic liquid-phase reactions occurring in pharmaceutical or chemical processes could benefit greatly from continuous production methods over batch.

A*STAR scientists noted biopharm companies and chemical producers utilizing the Reformatsky reaction, an “organozinc-catalyzed reaction that frequently overheats with batch processing,” could find value in continuous processing. Using continuous methods in this way, companies could save on labor and resource costs, retain high uptime rates, uphold product quality, and perhaps even leverage efficiency as a means of lowering prices for consumers.

Will continuous processing give you IT nightmares?
A recent Automation World survey conducted for its advertisers inadvertently revealed several crucial differences between business leaders operating continuous processes versus batch processes. In sharing the results, the publication has provided on-the-fence decision-makers with powerful insights into what process changes could mean for their business at large.

The survey found more readers working with continuous processing worried about “technology upgrades” and “cybersecurity” than those working with batch processes. While correlation does not imply causation, Automation World Director of Content and Editor-in-Chief David Greenfield, who wrote the accompanying article for the survey, raised important points on-the-fencers should not take lightly. With the increased connectivity and interoperability that cutting-edge continuous processing equipment brings, the companies capitalizing on it are likely to have a naturally heightened awareness of the possibility of system breaches. That said, if your organization already struggles with cybersecurity issues under a batch regime, it may be best to close those gaps first before pursuing continuous processing and the tech that makes it possible.

Continuous processing removes many inefficiencies batch producers have struggled with since the dawn of modern industry. However, implementing continuous processes without proper foresight could backfire. Be sure to research how continuous processing has made an impact in your specific industry before integration, if you wish to glean a competitive advantage.

Natural gas production has remained stagnant even as the nation creeps toward cooler weather. Meanwhile, processing plants have begun to increase the variety of products in their portfolios, investing in asset infrastructure for purifying natural gas liquids.

But what does diversification like this mean, especially to an industry focused on cutting Opex costs and optimizing production? What concerns should stay at the forefront of midstream investors’ minds when installing, expanding, or reconfiguring NGL fractionation and distillation equipment?

Plan for market agility through asset utilization
Although the low cost of natural gas may benefit gas-fired energy generators across the country – especially as air conditioning demand trends upward, according to Reuters – companies entrenched in the oil and gas industry must find new methods for capitalizing on goods without saturating the market. Extracting pentanes and other valuable hydrocarbons from NGLs lets natural gas organizations avoid tapping extra wells and make the most of the production already available to them.

However, as midstream operations spin ethane, butane, etc. from NGLs, asset expansion necessary to control these varied resources only stands to complicate processing and open up room for mechanical failures, product mishandling, and perhaps even regulatory noncompliance. Additionally, a diversified stock so reliant on domestic and export market performance requires responsiveness to remain a boon to business. When one outperforms others, decision-makers must be at the ready to tilt production accordingly without compromising quality or service.

Maintain cost-effective energy consumption
Industry leaders know distillation columns used in NGL fractionation burn a lot of thermal energy, with as much as 40 percent used on site for “refining and continuous chemical processes,” according to the U.S. Department of Energy.

Upstream, labor reductions in extraction trim production to avoid market saturation, but these austerity measures also aim to deflate costs throughout oil and gas operations while prices remain low. Adding energy-intensive assets without taking energy expenses into consideration may undermine cost-cutting initiatives elsewhere. Apart from balancing the books and ensuring operational growth doesn’t derail Opex cost reduction, what else can NGL producers and processors do to mitigate how much distillation may grow their energy footprint?

One method, according to the American Institute of Chemical Engineers, involves targeting energy variability through the establishment of pressure controls, particularly for light-hydrocarbon columns. Researchers found that even a 7 percent reduction in pressure could save a typical distillation process $240,000 in annual energy costs. Moreover, advanced condenser mediums capable of balancing distilled resources at the perfect temperature could more than double those gains.
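To see roughly how a percentage pressure reduction translates into dollar savings, here is a back-of-envelope sketch. All the inputs (reboiler duty, operating hours, energy price) are assumptions chosen for illustration, not figures from the AIChE research, and the proportional relationship between pressure and energy demand is a simplification of real column thermodynamics.

```python
# Illustrative only: rough annual-savings estimate for a pressure reduction
# on a light-hydrocarbon distillation column. Inputs are assumed values.

def annual_energy_cost(duty_mw, hours_per_year, cost_per_mwh):
    """Annual cost of running a column at the given thermal duty."""
    return duty_mw * hours_per_year * cost_per_mwh

# Assumed column: 5 MW duty, 8,400 on-stream hours, $80/MWh energy.
baseline = annual_energy_cost(duty_mw=5.0, hours_per_year=8400, cost_per_mwh=80.0)

# Assume a 7% pressure reduction cuts energy demand proportionally
# (a simplification; real gains depend on the specific column).
savings = baseline * 0.07

print(f"Baseline: ${baseline:,.0f}/yr, estimated savings: ${savings:,.0f}/yr")
```

With these assumed inputs the estimate lands near the $240,000 figure cited above, which is the point of the exercise: even single-digit pressure reductions compound into six-figure annual sums at typical column duties.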

Oil and gas companies ought to concentrate more on how they run their distillation towers.

Avoid distillation column misuse
How fractionation towers function in a more general sense also matters, especially if on-site NGL distillation has undergone maturation because of process mapping and other physical changes to the layout of a facility.

For instance, Chemical Processing reported how many refiners focus too heavily on condensers and reboilers when they should give equal consideration to column feeds and how they perform around entrant trays. A misplaced feed could force fractionation towers to work overtime and increase their energy demand unnecessarily. This mistake may also cause asset failure due to imbalance, compromising safety and the quality of the product therein, as well as every other NGL that would have been harvested down the chain.

When altering process organization for greater operational efficiency gains, don’t alter feed locations unless data confirms the move won’t jeopardize asset availability and uptime. Remember: Distillation towers are almost the perfect embodiment of the domino effect. If one column becomes compromised, you will almost assuredly lose all others until the problem is remedied.

Fractionation presents natural gas with horizons to conquer and opportunities to turn market troughs into progressive growth as companies expand the scope of their operations. Before integrating new distillation assets or changing how you use the ones already on site, discuss your plans with a knowledgeable consultancy, preferably one with a specialization in continuous processes and utilization, as well as one with a proven track record in the field of oil and gas.

How can a business in the chemical processing sector maintain a commitment to continuous improvement as its industry undergoes a period of financial stagnation or decline?

Optimization Realization 
Optimization isn’t an abstract idea – it’s a discipline rooted in real process changes and data-driven exploration into on-site and remote operations. After all, when companies expend the effort to improve, they’re not competing against competitors per se. They’re competing against the best version of themselves.

Unfortunately, that level of dedication does not come free. According to the American Chemical Society, many key players in the chemical processing sphere experienced turbulence coming into 2016 and throughout the first half of the year. Some stressors were milder than others, but all contributed in some way to a decrease in revenue and could impact how these companies optimize and innovate.

The question is, are there methods for sustaining optimization initiatives even when budgets are tight, and if so, where should chemical processors focus their attention to derive the best results?

Be wary of diversification
Reaching out into new markets may open new frontiers for companies when business is up, but when sales plateau, product diversification needs serious consideration. Deloitte research revealed that as industrial production contracted in 2015 across all industries, chemical demand also waned. In response, many chemical processors took the opportunity to retool their core business rather than take risks on experimental projects. Essentially, they chose to optimize over maximize.

However, PwC chronicled the turmoil of a subsection of chemical processing that didn’t fall in line with this mentality: engineering polymers. According to PwC, one company tried to expand into new areas of business both geographically and through product diversification during this risky market environment. The expansion strained logistics resources, stoked internal conflict, and disrupted the supply chain.

So, when it comes to product diversification, how can businesses tell the difference between a sure thing and a dud? By first investing in thorough, unbiased analysis, perhaps from a third party. However, if funds are tight, that money might be better spent on next-generation productivity through an overhaul of the core processes underpinning company culture. Doing so has been shown to reduce operational expenditures, freeing up more spend for opportunities at more stable junctures.

Understand customers – and yourself – through data
The age of big data is both a blessing and a curse for optimization in chemical processing. On the plus side, it presents an opportunity to forecast fluctuations in demand, materials performance, and internal operations charged with capitalizing on these elements. By leveraging the most actionable data management strategies, chemical companies have the power to amp up their services and dive deeper into the nuances of their industry like never before.

The problem is, so can everybody else. So, while big data can help an individual business accomplish its goals, it simultaneously raises the bar for all industry players with regard to what clients expect as the status quo.

With that in mind, chemical processors should tailor all optimization initiatives toward retaining the customers they already have, instead of playing to the clients in their competitors’ pools. Focus on what separates your company from others, then strive to optimize those services as much as possible. Also, don’t settle on what services you do best, but rather what services you do differently that resonate with your customer base.


At a glance, the mission of a quality control specialist working in fields like chemical, medical device manufacturing, or life sciences seems different from that of a production manager at the same company. After all, isn’t quality control all about ensuring the safety of the products no matter how long it takes, whereas production itself is far more concerned with meeting quotas and demand on a tight schedule?

Yes and no – while quality control standardizes the manufacturing process to avoid variances harmful to customers and the reputation of the organization at large, QC microbiologists and technicians no doubt have work orders of their own to fill and capacities to reach when it comes to testing. And although production managers or other manufacturing specialists may have output on the mind, they understand that without a high standard for quality in operations, their businesses likely wouldn’t have any customer demand in the first place.

Optimizing QC laboratory processes in the manufacturing sector involves a delicate balancing of safety and speed, without compromising either. How can manufacturers improve QC cycle times while still performing everything they need to stay compliant?


All QC specialists should follow the same guidelines for greater risk prevention and cycle time preservation.

Drill down the basics
Good risk management in a QC lab should outline all methods for quarantining and reversing conditions adversely affecting manufactured goods. That way, microbiologists and lab technicians save resources, perform speedy investigations, and set QC processes back on track after an out-of-specification (OOS) event. However, there’s something to be said about avoiding trouble in the first place when cycle times are at stake.

To that end, the QC lab should take a page from lean manufacturing, particularly on the subject of process standardization and uniformity. The sequence in which technicians prepare for work, process samples, dispose of spent resources, or clean lab equipment matters greatly to both the success of the testing and the prevention of widespread contamination. An audit of testing operations performed by laboratory supervisors may reveal areas where technicians’ actions or inactions potentially subvert the constancy of QC processing and production.

If possible, supervisors should look to documentation on past OOS events for hints on where to start looking first to minimize time and resources spent investigating. That said, any small discovery that preempts a contamination event, whether found in either historical data or through careful observation, saves production considerably in cycle time.
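The idea above – mining past OOS records so investigators check the most frequent culprits first – amounts to a simple Pareto ranking. The sketch below shows one minimal way to do it; the event log and cause names are invented for illustration.

```python
# Hypothetical sketch: rank historical OOS root causes by frequency so an
# investigation starts with the likeliest culprit. Event data is invented.
from collections import Counter

oos_events = [
    "sample preparation", "uncalibrated equipment", "sample preparation",
    "miscalculation", "sample preparation", "uncalibrated equipment",
]

def investigation_order(events):
    """Return root causes ordered from most to least frequent."""
    return [cause for cause, _ in Counter(events).most_common()]

print(investigation_order(oos_events))
# prints ['sample preparation', 'uncalibrated equipment', 'miscalculation']
```

In practice the event log would come from the lab's deviation records rather than a hard-coded list, but the principle is the same: historical frequency tells you where to look first, shortening investigations and protecting cycle time.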

Bring in automation
Research published by The Royal Society of Chemistry analyzing the most common errors in chemical laboratories uncovered the greatest threat to QC cycle time stability: humans. The study found that issues like faulty sample preparation, uncalibrated equipment, miscalculation, and general human error made up the majority of OOS incidents. While insightful, these findings should come as no surprise to manufacturers, especially those who witnessed the age of manual production give way to automation.

“Manual processes anywhere open businesses up to risk.”

Truth be told, manual processes anywhere in the production cycle open businesses up to risk, perhaps even unnecessarily. The burgeoning field of rapid microbiological methods devotes itself entirely to finding a solution to this very issue. Manufacturers should likewise devote their time to investigating and investing in innovations that target low-value, high-risk laboratory activities like data keying or slide movement between processing stations and incubators. Focusing on these areas mitigates the risk of production downtime due to contamination, frees up microbiologists for more value-added opportunities, and reduces the overall time spent performing these tasks, all supporting better cycle times for the rest of the plant.

Go digital for smarter oversight
There’s a reason why many QC labs have gone digital with laboratory information management systems (LIMS). LIMSes aggregate and organize all QC processing data stored therein, so laboratory workers can use that information in ways that complement faster, more consistent cycle times. Dashboards and other visualizations immediately come to mind. When technicians can easily interpret their workloads and capacity demands at a moment’s notice, they spend more time applying their talent to testing.

Manufacturers should remember to align their investment strategies with the cycle time improvement initiatives established above. For instance, if a QC lab still finds value in manually keying data directly into a LIMS, perhaps it should purchase software with validated value fields. A single misplaced decimal point could send a laboratory on a costly wild goose chase attempting to find the phantom catalyst that caused an OOS reading. Some LIMS software can prevent technicians from entering numbers or symbols outside prearranged value ranges, so an error in the QC lab doesn’t carry over onto the production floor in the form of downtime.
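The value-range guard described above is straightforward to sketch. Below is a minimal, hedged illustration of the concept rather than any particular LIMS vendor's feature; the analyte names and limits are hypothetical.

```python
# Sketch of a prearranged value-range guard: reject implausible entries
# before they reach the LIMS record. Field names and ranges are invented.

VALID_RANGES = {
    "ph": (0.0, 14.0),
    "endotoxin_eu_ml": (0.0, 100.0),
}

def validate_entry(field, value):
    """Return the value if it falls within the prearranged range, else raise."""
    low, high = VALID_RANGES[field]
    if not (low <= value <= high):
        raise ValueError(f"{field}={value} outside allowed range [{low}, {high}]")
    return value

validate_entry("ph", 7.2)            # accepted
try:
    validate_entry("ph", 72.0)       # a misplaced decimal point is caught here
except ValueError as err:
    print(err)
```

Catching the misplaced decimal at entry time is the whole point: the bad number never reaches the record, so no one burns cycle time chasing a phantom OOS reading.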


Enterprise spend management stresses the value of smart budgeting with a tight focus on maximizing ROI, optimizing processes, and expanding businesses in meaningful and sustainable ways. One may think the chemical processing and refining industry wouldn’t need to worry too much about how it spends, given how its products permeate as much as 96 percent of the supply chains for all manufactured goods, according to a recent study.

However, the success of the chemical industry as a whole is built on a foundation composed of companies of all shapes and sizes looking to invest practically in their own prosperity. For spend managers at these organizations hoping to push their enterprise into the 21st century, what value could be achieved by concentrating spend where it’s needed most?


Reliability-centered maintenance
Direct spend management strategies tend to take top billing over indirect spend management. Direct spend management centers on equipment purchases, software integration, and other physical assets a company can buy. Indirect spend management, like coordinating spend on a sound on-site maintenance program, may not be as viscerally appealing as state-of-the-art assets, but it can be just as valuable to a chemical processing plant, if not more so.

Reliability-centered maintenance strategies utilize vast stores of data to analyze how plant equipment functions. In doing so, supervisors can track productivity and spot deficiencies that may evolve into failures over time. Maintenance personnel then address these issues before they escalate. From a spend management perspective, this practice undeniably adds value. Proactive maintenance hits on several crucial concepts the chemical processing industry has been moving toward reinforcing: environmental accountability, employee safety, asset availability and uptime, and overall process optimization.

“Asset owners can use maintenance spend to gauge the financial viability of a replacement.”

Furthermore, an updated maintenance plan could guide organizations toward making more intelligent direct spend management decisions. If one asset continues to underperform despite several rounds of proactive maintenance, its owners can use maintenance spend as a metric to gauge the financial viability of a replacement.
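One simple way to operationalize "maintenance spend as a replacement metric" is to compare yearly upkeep against the annualized cost of a new asset. The sketch below is an illustrative rule of thumb under stated assumptions, not a USCCG method; the dollar figures and the straight-line annualization are both simplifications (a fuller analysis would discount cash flows and account for downtime costs).

```python
# Illustrative repair-or-replace check: does yearly maintenance spend on an
# asset exceed the annualized cost of replacing it? Figures are assumed.

def favors_replacement(annual_maintenance, replacement_cost, expected_life_years):
    """True when yearly maintenance exceeds the replacement's annualized cost."""
    annualized_replacement = replacement_cost / expected_life_years
    return annual_maintenance > annualized_replacement

# A pump costing $45k/yr to keep alive vs. a $200k replacement lasting 10 years:
print(favors_replacement(45_000, 200_000, 10))  # 45k/yr > 20k/yr, so True
```

The decision rule is deliberately crude, but it makes the article's point concrete: once several rounds of proactive maintenance fail to bring an asset's upkeep below the replacement's annualized cost, direct spend on new equipment becomes the more defensible choice.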

Driving out commoditization with R&D investment
Growing commoditization of chemical goods increasingly discourages businesses in the industry from investing in innovation.

In his book Winning at New Products: Creating Value Through Innovation, author Robert G. Cooper explains that although the average time-to-market across all manufacturing sectors has trended downward since the turn of the century by more than 42 percent, new product sales decreased over that same period of time by 15 percent. Many investors cannot justify the “risks” of R&D, like steep upfront costs and long lead time, when highly specialized chemicals already in market favor perform so well.

Desire for a quick buck has officially overridden the urge to innovate, but from a spend management perspective, this narrow mindset offers little in the way of long-term financial sustainability. The chemical industry shouldn’t be afraid to expand its horizons by investing healthily in a diverse product portfolio, but it must do so with tact. For instance, chemical companies should only make room in the budget for prospective products that fulfill an explicit objective valuable to their enterprise. In addition, technological investments should serve more than a single purpose, lest they go to waste on the off chance a new product fails to capture the market’s attention.

Chemical processing and other batch manufacturing operations are becoming increasingly complex. Manufacturers today need to have a fast response time when quality issues arise and when equipment requires maintenance. Combining real-time analysis with reliability-centered maintenance is a great way to raise efficiency in chemical processing. A constant stream of data coupled with a proactive maintenance plan can ensure that production quality remains high and that plant assets operate in prime condition.

Real-Time Analysis Improves Product Quality
Chemical Processing explained that in batch manufacturing operations, identifying product quality deviations is not an easy thing to do. Additionally, if production continues and quality issues are not spotted, the cost of wasted time, materials, and energy adds up quickly. This is why it is beneficial to use predictive control techniques so that product quality remains consistent.

According to Chemical Processing, product quality checks should be done with high frequency and accuracy. Automated feedback control can help manufacturers keep track of operations. However, measurements need to be taken without interrupting processes or increasing risk of contamination. Off-line, at-line, and on-line automated devices can allow measurements to be taken close to the process without slowing down the pace of production. Manufacturers can leverage these technology solutions to minimize or even eliminate time delays, if they are able to use the steady stream of accurate and reliable information.

“Product quality checks should be done with high frequency and accuracy.”

Analyzers provide measurements in real time or near-real time, but manufacturers must be able to respond quickly. For that to happen, KPIs from the analyzers should be worked into the management operating system so that plant managers know how to respond to spikes or dips in the numbers. Data is only useful if companies know how to use it, which is why manufacturing industry experts pride themselves on a hands-on approach: consulting advice given from the boardroom is not always helpful.
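Wiring analyzer KPIs into a management operating system can start as simply as comparing each reading against agreed control limits and routing out-of-limit readings to whoever must respond. The sketch below illustrates that pattern; the KPI names, limits, and readings are invented for the example, and a production system would obviously pull from live analyzer feeds rather than a hard-coded list.

```python
# Minimal sketch: flag analyzer readings that spike above or dip below
# prearranged control limits. All KPIs, limits, and readings are invented.

CONTROL_LIMITS = {"viscosity_cp": (40.0, 60.0), "purity_pct": (98.5, 100.0)}

def flag_readings(readings):
    """Return (kpi, value, 'high'/'low') for every out-of-limit reading."""
    alerts = []
    for kpi, value in readings:
        low, high = CONTROL_LIMITS[kpi]
        if value < low:
            alerts.append((kpi, value, "low"))
        elif value > high:
            alerts.append((kpi, value, "high"))
    return alerts

stream = [("viscosity_cp", 52.1), ("purity_pct", 98.1), ("viscosity_cp", 61.3)]
print(flag_readings(stream))  # the purity dip and viscosity spike are flagged
```

The hard part, as the article notes, is not the flagging logic but the response: each alert needs an owner and a defined action in the management operating system, or the data stream is just noise.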

Use technology to provide real-time analysis to monitor quality.

Data Helps Manufacturers Stay Competitive
Chem.info explained that big data is changing the chemical industry in a profound way. Companies can apply real-time analysis to data from production centers to increase margins, improve product quality, and shape business processes. Once manufacturers put in place the systems that collect the data, they should then identify the best ways to use it to reduce costs, manage assets, generate revenue, and make smarter decisions going forward. Plant managers who can effectively interpret the data and respond accordingly will see noticeable improvements in their business. For example, data analysis can help manufacturers make variants of existing products and either lower production cost per unit or produce a higher-quality substitute with higher profit margins.

One practical way to use data from real-time analysis is to link it with reliability-centered maintenance. Demonstrating vigilance in monitoring quality and production levels, as well as keeping assets in optimal condition, can lower overheads and raise efficiency. As previously mentioned, the chemical manufacturing process is complex, but properly leveraging data makes things simpler, allowing companies to better manage production assets and throughput processes simultaneously.

It is important to mention, however, that smart technology alone will not provide the desired benefits. Data collection and analysis is only a means by which better decisions and actions are taken. Companies that understand this last point know that educating staff on how to interpret and proactively respond to KPIs is just as important as investing in new technology. As the manufacturing field continues to be influenced by evolving technology, management skills must evolve as well.

“Trade Value for Value – If you are not putting in more than you’re taking out, you are either a thief or a mooch. I don’t want to be either.” – Paul Harker

Paul Harker is a Senior Operations Manager at USCCG and has been with the firm since 1988. He has been among the leaders within USCCG in developing our Inventory Management and Sales & Operations Planning processes. Paul has conducted approximately 160 implementations in projects for over 75 clients located in the US, Canada, Mexico, Denmark, Italy, Germany, Taiwan, and China. Despite his busy schedule, he was kind enough to let me ask a few questions about his recent work with clients in the chemical industry, and I’m excited to share Paul’s insights with all of you.

Chemicals is a broad industry. What types of clients do you work with most often?

In terms of process types, our work has been with Continuous and Batch Processing facilities. The Continuous Processing facilities conduct production campaigns, often lasting many months on a single product. The Batch Processing facilities manufacture discrete batches of various products in self-contained loops of vessels.

The product types range from pharmaceuticals, surfactants, and amines to petrochemicals. Their uses range from industrial applications to additives enhancing our medicines, our food, and even our beer.

How is this industry different from other business sectors?

Chemical manufacturing is extremely capital intensive and frequently requires high research and development costs. This industry is stringently regulated on product specifications, sanitation, environmental impact, and employee and public safety. The high cost of entry makes the competitive landscape relatively stable and the operating margins can be quite attractive.

What kinds of issues are your current clients facing?

Because of the capital invested and the margin opportunities, it is critically important that the On-Stream Time of their plants remains very high. Obviously, if they are not producing, they are not making any money. The percentage of On-Stream Time typically needs to be in the mid to high 90% range.

In addition to being On-Stream, the processes within the facilities need to be operating at or near their Rated Capacity. It does our clients little good if they have 98% On-Stream Time, but are only running at 10% of their Rated Capacity. Both indices need to be impacted in order to elevate the output of the plant.
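Paul's point is that the two indices multiply: a plant's effective output is the product of its On-Stream Time and its capacity utilization. A quick illustrative calculation (with invented percentages) makes the 98%-on-stream-at-10%-capacity example concrete:

```python
# Back-of-envelope illustration: On-Stream Time and Rated Capacity
# utilization multiply into effective plant output. Percentages are examples.

def effective_output_pct(on_stream_pct, capacity_pct):
    """Effective output as a percentage of the theoretical maximum."""
    return on_stream_pct * capacity_pct / 100.0

# 98% on-stream but only 10% of rated capacity:
print(effective_output_pct(98, 10))   # 9.8 -> under a tenth of potential
# 95% on-stream at 90% of rated capacity:
print(effective_output_pct(95, 90))   # 85.5
```

The asymmetry is the lesson: near-perfect uptime cannot compensate for poor capacity utilization, so both levers must be pulled to elevate plant output.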

For some clients, the largest erosion of On-Stream Time takes place during their planned Turnarounds (sometimes referred to as Shutdowns). These are periods when the plant is taken down in order to complete maintenance tasks or process improvements that cannot take place while the facility is running. These Turnarounds may occur annually and require two or more weeks to complete. Keeping these as effective and as short as possible are of huge value.

Other clients may have their Turnarounds well managed, but have smaller bites taken out of their On-Stream Time by thinly managed Down Days. A Down Day is a generic term for when a plant goes down due to an unplanned event, or for a planned maintenance or construction task. The duration may be a few hours or a day or two, but these are still referred to as Down Days. Although smaller bites, these also need to be well managed in order to extract the maximum value in the shortest amount of time.

These facilities are tremendously complex and, even with state-of-the-art Distributed Control Systems (DCS), there are typically thousands of steps required to start up, run, and ramp down the plant. Well-defined and linked procedures are necessary to operate safely and effectively. This is particularly important as the industry expands and the experienced workforce approaches retirement.

Virtually all of these clients face an ever-increasing number of regulations and regulatory bodies. The amount of data required concerning the process, equipment configuration and condition, and maintenance task definition and recording has been growing exponentially. This places more requirements on all levels of the organization as well as on the Information Technology infrastructure and support.

What kinds of solutions and benefits can you bring to your clients?

We can have an impact on On-Stream Time by managing Turnaround (TA) events and Down Days (DD) differently. To knock a few days off an annual TA, we have developed a Turnaround Management Operating System: a comprehensive, closed-loop system containing the tools, procedures, and practices to manage the balance between the speed and the effectiveness of TAs. We can get more out of DDs by enhancing the use of the Computerized Maintenance Management System (CMMS), overlaying additional planning steps, and restructuring the responsibilities within the planning and materials groups.

We can also have an impact on the Rated Capacity through shifting the maintenance organizations from a Response Centric approach to a Reliability Centric approach. A plant that runs reliably spends more time at or near the rated capacity. This often means building a Reliability Function with dedicated planners and crafts persons. Within that function, we place inspections, lubrication routes, and Preventive Maintenance tasks as well as Down Day and Turnaround events. Additionally, this often requires conducting Failure Mode and Effect Analysis (FMEA) in order to define the correct work plan and frequency for each piece of equipment. Through the years, we have developed a number of templates that dramatically speed up this otherwise daunting task.
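The FMEA scoring step Paul mentions is conventionally done with a Risk Priority Number: severity × occurrence × detection, each rated on a 1-10 scale, with the highest RPN setting maintenance priority. The sketch below illustrates the standard calculation; the equipment entries and ratings are hypothetical examples, not client data or USCCG templates.

```python
# Sketch of FMEA prioritization via Risk Priority Number (RPN).
# RPN = severity x occurrence x detection, each rated 1-10.
# Failure modes and ratings below are invented for illustration.

def rpn(severity, occurrence, detection):
    """Risk Priority Number for one failure mode."""
    return severity * occurrence * detection

failure_modes = [
    ("seal leak on transfer pump", rpn(8, 5, 3)),
    ("bearing wear on compressor", rpn(6, 7, 2)),
    ("level sensor drift", rpn(4, 6, 7)),
]

# The highest RPN gets the first (or most frequent) preventive task.
for name, score in sorted(failure_modes, key=lambda fm: fm[1], reverse=True):
    print(f"{name}: RPN {score}")
```

Note how detection weighs in: the sensor drift is neither the most severe nor the most frequent mode, yet its poor detectability pushes it to the top of the list, which is exactly the kind of insight a response-centric maintenance culture tends to miss.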

Are there any other industries that could benefit from some of the solutions implemented at your clients’ sites?

Any continuous flow manufacturing facility could benefit from these approach elements. Food processing, pharmaceuticals, pulp and paper, petrochemical production and transmission, waste water treatment, and power plants all come to mind.