
The Trump administration recently published its 2019 budget proposal, which includes deep spending cuts totaling hundreds of billions of dollars. The budget request also calls for the elimination of several federal oversight bodies, including the U.S. Chemical Safety Board, according to Bloomberg. Created by the Clean Air Act Amendments of 1990, the independent watchdog leverages $11 million in annual funding to investigate industrial incidents stemming from the mismanagement of hazardous chemicals. While the elimination of the CSB seems, on the surface, an ideal development for industrial organizations, some industry leaders and workplace safety experts have expressed skepticism.

Modern manufacturers are deeply invested in protecting their employees, and support the work of bodies such as the CSB as they establish new workplace safety paradigms centered on innovative strategies and technology. Current CSB Chairperson Vanessa Allen Sutherland has received praise from industry leaders for streamlining the agency’s investigation workflows and collaborating more effectively with businesses. Despite these positive developments, however, the agency has been put on the chopping block as part of a wider push for government deregulation.

How would the abolition of the CSB impact firms developing new safety and reliability programs?

 

Addressing chemicals in the workplace
Chemical compounds are among the most serious safety hazards found within industrial work environments, according to the National Safety Council. Manufacturers and other businesses leverage hundreds of different substances in everyday workflows and produce significant amounts of equally dangerous chemical residue. Workers who encounter these materials can suffer serious or sometimes fatal injuries. In fact, approximately 268 American employees died in 2016 because of such exposure events, according to research from the Occupational Safety and Health Administration. Firms in the industrial space are well aware of the dangers that their workers face, which drives them to develop safety and reliability programs that prevent injuries.

Oversight bodies like OSHA and the CSB are heavily involved in these efforts, working with industry stakeholders to create enforceable policies that keep employees safe, even as they encounter risk while performing everyday duties. In 2016, the CSB conducted seven major investigations, including an inquiry into the 2013 explosion at the West Fertilizer Company plant in West, Texas, that killed 15 people and injured more than 260 others. Through these investigations, the CSB developed best practice recommendations so industrial businesses do not repeat the errors of their less fortunate peers. OSHA adds another dimension by approaching the subject of chemical management from the position of the worker and formulating enforceable safety standards. While businesses in the industrial space have traditionally butted heads with OSHA, they have had a productive relationship with the CSB, which many leaders credit with revolutionizing chemical handling practices here and abroad. Its investigations have resulted in the creation of new guidelines that not only keep workers safe but also reduce costs associated with employee injury.

 

Considering operations after CSB
If Congress embraces the Trump administration’s budget and authorizes the elimination of the CSB, industrial organizations would have to seek out new external partners and refocus their efforts to ensure vigilance in an environment with little federal oversight. The critical insight the agency once provided would be gone, increasing the likelihood of catastrophic events caused by small operational lapses. The West Fertilizer Company explosion alone resulted in more than $230 million in damages to the local community. Without the CSB, similar situations may develop.

USC Consulting Group can help chemical manufacturers improve operating efficiency by developing effective safety and reliability programs for addressing chemical usage in the workplace. Furthermore, our consultants can establish asset performance management programs to ensure facilities are properly maintained through scheduled maintenance and well-planned outages, keeping employees safe from avoidable mishaps.

Is your organization considering how it might operate in a world without the U.S. Chemical Safety Board? Connect with USC Consulting Group today to learn more about how to improve safety in the workplace.

 

Continuous Improvement (CI) for process industries can feel like trying to fix plumbing with the water turned on. No matter how carefully you approach the problem, messes will occur.

At petrochemical and bulk chemical plants where operations run around the clock, any downtime subtracts from potential revenue. The only recourse is to minimize losses and improve operations in ways that deliver lasting, timely change while upholding today’s safety and quality standards.

That’s where CI makes its case.

With that said, process improvement researchers at technology firm Quandary Consulting Group have found that CI initiatives, while beneficial, often face several barriers that sink success. Where have CI implementers in process industries struggled the most when applying this important discipline to their operations?

They didn’t finish what they started

CI, referred to as kaizen in the circles of lean manufacturing and Six Sigma, does not mean existing in a state of perpetual adjustment, where something is always in progress. Rather, kaizen encourages an openness and willingness to change at all times, a focus on betterment over contentment with the status quo.

Facilities run into trouble when they begin a CI project and fail to fully follow through, choosing instead to look at what lies ahead rather than what’s currently in their laps. The first thing to go is the detailed real-time analysis, then motivation wanes, then the project lingers on unfinished or is wrapped up hastily. Prevention requires three resources: a senior-level leader who takes ownership of the project at hand, a quantifiable terminus, and a method for articulating gains to project stakeholders to energize involvement.

They never considered those most affected by changes

CI pushes adopting companies toward operational perfection, but as a philosophy, it also challenges the behavior of those directly involved in improvement initiatives, both leaders and technicians.

Change management by any name requires advocacy spurred by honest collaboration. Not every improvement will appear as such to certain workers, and indeed some might not be; the benefits of change may not readily present themselves. It’s up to project leaders to communicate what operational excellence will be achieved through improvement. Those leaders must also learn to use that communication as a springboard for change from the outset, rather than as a way to solicit involvement in the middle or at the end of a project.

They didn’t recognize when enough was enough

CI projects push limits, but eventually limits push back. Using music as a metaphor, columnist and consultant Alastair Dryburgh illustrated two different ways facilities with continuous improvement initiatives react to those hard limits.

“There’s a very clear standard of what makes a good pianist or violinist,” wrote Dryburgh, “and the difference between the really top people and the next level is very small […] In rock, rap or hip hop, on the other hand, it’s different. Everyone is working in the same sort of genre, roughly, but success comes from establishing a distinctive sound and personality.”

Companies tend to think about continuous improvement too literally, as making something progressively better and better. Eventually, however, they reach a limit where the investment in change far outweighs the slight return. Those organizations must consider innovation itself as a form of CI. What’s left to determine is whether the limit truly can go no higher, or whether doing something daring in another area will achieve even greater operational performance.

They didn’t leverage the right expertise

Advanced process control systems commonly used in process industries bolster best practices for operational data utilization, but they aren’t everything. Big data insights that encourage CI need capable hands to convert those values into actionable strategy. If leaders at processing plants are without such expertise, they should consider a partnership with operational consultants for large-scale projects or for advice on how to train current operators to support future endeavors. For more information, contact the operations management consultants at USC Consulting Group.



Industries from biopharm to oil and gas are abuzz with praise for continuous processing technology and the advantages the model brings to their businesses. Compared with batch production, these processes typically take less time, consume less energy, and have a smaller operational footprint, depending on the industry and assets utilized. From there, many businesses have seen significant Opex cost reductions, productivity gains, and alternative value-add opportunities.

Hype surrounding continuous processing can be particularly difficult to examine objectively, especially for decision-makers who lack the technical expertise to determine whether certain batch processes under their purview are ready for, or in need of, an upgrade to continuous status.

Do your operations fit the criteria below? Then it may be time to switch. Or, perhaps given what you learn, you must develop other areas first before taking the dive into continuous processing to gain and sustain its benefits.

Continuous processes ‘heating up’ in biopharm and chemical processing
If the science matches up, your company could be a prime candidate for continuous processing. Researchers from the Agency for Science, Technology and Research in Singapore published a study demonstrating how exothermic and endothermic liquid-phase reactions occurring in pharmaceutical or chemical processes could prosper greatly from continuous production methods over batch.

A*STAR scientists noted biopharm companies and chemical producers utilizing the Reformatsky reaction, an “organozinc-catalyzed reaction that frequently overheats with batch processing,” could find value in continuous processing. Using continuous methods in this way, companies could save on labor and resource costs, retain high uptime rates, uphold product quality, and perhaps even leverage efficiency as a means of lowering prices for consumers.

Will continuous processing give you IT nightmares?
A recent Automation World survey conducted for its advertisers inadvertently revealed several crucial differences between business leaders operating continuous processes versus batch processes. In sharing the results, the publication has provided on-the-fence decision-makers with powerful insights into what process changes could mean for their business at large.

The survey found more readers working with continuous processing worried about “technology upgrades” and “cybersecurity” than those working with batch processes. While correlation does not imply causation, Automation World Director of Content and Editor-in-Chief David Greenfield, who wrote the accompanying article for the survey, raised important points on-the-fencers should not take lightly. With the increased connectivity and interoperability that comes with cutting-edge continuous processing equipment, the companies capitalizing on it are likely to have a naturally heightened awareness of the possibility of system breaches. That said, if your organization already struggles with cybersecurity issues under a batch regime, it may be best to devote attention to those gaps first before pursuing continuous processing and the technology that makes it possible.

Continuous processing removes many inefficiencies batch producers have struggled with since the dawn of modern industry. However, implementing continuous processes without proper foresight could backfire. Be sure to research how continuous processing has made an impact in your specific industry before integration, if you wish to glean a competitive advantage.

Natural gas production has remained stagnant even as the nation creeps toward cooler weather. Instead, processing plants have begun to increase the variety of products in their portfolios, investing in asset infrastructure for purifying natural gas liquids.

But what does diversification like this mean, especially to an industry focused on cutting Opex costs and optimizing production? What concerns should stay at the forefront of midstream investors’ minds when installing, expanding, or reconfiguring NGL fractionation and distillation equipment?

Plan for market agility through asset utilization
Although the low cost of natural gas may benefit gas-fired energy generators across the country – especially as air conditioning demand trends upward, according to Reuters – companies entrenched in the oil and gas industry must find new methods for capitalizing on goods without saturating the market. Extracting pentanes and other valuable hydrocarbons from NGLs saves natural gas organizations from tapping extra wells and lets them make the most of the production already available to them.

However, as midstream operations spin ethane, butane, and other products out of NGLs, the asset expansion necessary to control these varied resources only stands to complicate processing and open up room for mechanical failures, product mishandling, and perhaps even regulatory noncompliance. Additionally, a diversified product slate so reliant on domestic and export market performance requires responsiveness to remain a boon to business. When one product outperforms the others, decision-makers must be ready to tilt production accordingly without compromising quality or service.

Maintain cost-effective energy consumption
Industry leaders know the distillation columns used in NGL fractionation consume a great deal of thermal energy; distillation accounts for as much as 40 percent of the energy used on site for “refining and continuous chemical processes,” according to the U.S. Department of Energy.

Upstream, labor reductions in extraction trim production to avoid market saturation, but these austerity measures also aim to deflate costs throughout all oil and gas operations while prices remain low. Adding energy-intensive assets without taking energy expenses into consideration may undermine cost-cutting initiatives elsewhere. Apart from balancing the books and ensuring operational growth doesn’t derail Opex cost reduction, what else can NGL producers and processors do to limit how much distillation grows their energy footprint?

One method, according to the American Institute of Chemical Engineers, involves targeting energy variability through the establishment of pressure controls, particularly for light-hydrocarbon columns. Researchers found even a 7 percent reduction in pressure could yield the typical distillation process a savings of $240,000 in annual energy costs. Moreover, advanced condenser mediums capable of balancing distilled resources at the perfect temperature could more than double those gains.

Oil and gas companies ought to concentrate more on how they run their distillation towers.

Avoid distillation column misuse
How fractionation towers function in a more general sense also matters, especially if on-site NGL distillation has matured through process mapping and other physical changes to the facility’s layout.

For instance, Chemical Processing reported that many refiners focus too heavily on condensers and reboilers when they should give equal consideration to column feeds and how they perform around the feed trays. A misplaced feed could force fractionation towers to work overtime and increase their energy demand unnecessarily. This mistake may also cause asset failure due to imbalance, compromising safety and the quality of the product in the column, as well as every other NGL that would have been harvested down the chain.

When altering process organization for greater operational efficiency gains, don’t alter feed locations unless data confirms the move won’t jeopardize asset availability and uptime. Remember: Distillation towers are almost the perfect embodiment of the domino effect. If one column becomes compromised, you will almost assuredly lose all others until the problem is remedied.

Fractionation presents natural gas producers with new horizons to conquer and opportunities to turn market troughs into progressive growth as companies expand the scope of their operations. Before integrating new distillation assets or changing how you use the ones already on site, discuss your plans with a knowledgeable consultancy, preferably one that specializes in continuous processes and asset utilization and has a proven track record in oil and gas.

How can a business in the chemical processing sector maintain a commitment to continuous improvement as its industry undergoes a period of financial stagnation or decline?

Optimization Realization 
Optimization isn’t an abstract idea – it’s a discipline rooted in real process changes and data-driven exploration into on-site and remote operations. After all, when companies expend the effort to improve, they’re not competing against competitors per se. They’re competing against the best version of themselves.

Unfortunately, that level of dedication does not come free. According to the American Chemical Society, many key players in the chemical processing sphere experienced turbulence coming into 2016 and throughout the first half of the year. Some stressors were milder than others, but all contributed in some way to decreased revenue and could affect how these companies optimize and innovate.

The question is, are there methods for sustaining optimization initiatives even when budgets are tight, and if so, where should chemical processors focus their attention to derive the best results?

Be wary of diversification
Reaching out into new markets may open new frontiers for companies when business is up, but when sales plateau, product diversification needs serious scrutiny. Deloitte research revealed that as industrial production contracted in 2015 across all industries, chemical demand also waned. In response, many chemical processors took the opportunity to focus on retooling their core business rather than take risks with experimental projects. Essentially, they chose to optimize over maximize.

However, PwC chronicled the turmoil of a subsection of chemical processing that didn’t fall in line with this mentality: engineering polymers. According to PwC, one company tried to expand into new areas of business both geographically and through product diversification during this risky market environment. The end result was significant strain on logistics resources, internal conflict, and supply chain disruption.

So, when it comes to product diversification, how can businesses tell the difference between a sure thing and a dud? By first investing in thorough, unbiased analysis, perhaps from a third party. However, if funds are tight, that money might be better spent on next-generation productivity through an overhaul of the core processes underpinning company culture. Doing so has been shown to reduce operational expenditures, freeing up more spend for opportunities at more stable junctures.

Understand customers – and yourself – through data
The age of big data is both a blessing and a curse for optimization in chemical processing. On the plus side, it presents an opportunity to forecast fluctuations in demand, materials performance, and internal operations charged with capitalizing on these elements. By leveraging the most actionable data management strategies, chemical companies have the power to amp up their services and dive deeper into the nuances of their industry like never before.

The problem is, so can everybody else. So, while big data can help an individual business accomplish its goals, it simultaneously raises the bar for all industry players in regard to what clients expect as the status quo.

With that in mind, chemical processors should tailor all optimization initiatives toward retaining the customers they already have, instead of playing to the clients in their competitors’ pools. Focus on what separates your company from others, then strive to optimize those services as much as possible. Also, don’t settle on what services you do best, but rather what services you do differently that resonate with your customer base.


At a glance, the mission of a quality control specialist working in fields like chemical processing, medical device manufacturing, or life sciences seems different from that of a production manager at the same company. After all, isn’t quality control all about ensuring the safety of products no matter how long it takes, whereas production is far more concerned with meeting quotas and demand on a tight schedule?

Yes and no – while quality control standardizes the manufacturing process to avoid variances harmful to customers and to the reputation of the organization at large, QC microbiologists and technicians no doubt have work orders of their own to fill and capacities to reach when it comes to testing. And although production managers and other manufacturing specialists may have output on their minds, they understand that without a high standard for quality in operations, their businesses likely wouldn’t have any customer demand in the first place.

Optimizing QC laboratory processes in the manufacturing sector means balancing safety and speed without compromising either. How can manufacturers improve QC cycle times while still performing everything they need to stay compliant?


All QC specialists should follow the same guidelines for greater risk prevention and cycle time preservation.

Drill down the basics
Good risk management in a QC lab should outline all methods for quarantining and reversing conditions adversely affecting manufactured goods. That way, microbiologists and lab technicians save resources, perform speedy investigations, and set QC processes back on track after an out-of-specification (OOS) event. However, there’s something to be said for avoiding trouble in the first place when cycle times are at stake.

To that end, the QC lab should take a page from lean manufacturing, particularly on the subject of process standardization and uniformity. The sequence in which technicians prepare for work, process samples, dispose of spent resources, or clean lab equipment matters greatly to both the success of the testing and the prevention of widespread contamination. An audit of testing operations performed by laboratory supervisors may reveal areas where technicians’ actions or inactions undermine the consistency of QC processing and production.

If possible, supervisors should look to documentation on past OOS events for hints on where to start looking first, minimizing the time and resources spent investigating. That said, any small discovery that preempts a contamination event, whether found in historical data or through careful observation, saves production considerable cycle time.
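As a minimal sketch of that idea, the snippet below ranks historical OOS events by root cause and by hours lost so investigators know where to look first. The `oos_log` structure and its values are hypothetical, since the source doesn't describe any particular LIMS or deviation-tracking schema:

```python
from collections import Counter

# Hypothetical OOS event log; in practice this would come from the lab's
# LIMS or deviation-tracking system.
oos_log = [
    {"id": "OOS-101", "root_cause": "sample preparation", "hours_lost": 6},
    {"id": "OOS-102", "root_cause": "uncalibrated equipment", "hours_lost": 12},
    {"id": "OOS-103", "root_cause": "sample preparation", "hours_lost": 4},
    {"id": "OOS-104", "root_cause": "data entry error", "hours_lost": 2},
]

# Rank root causes by how often they occur.
by_frequency = Counter(event["root_cause"] for event in oos_log)

# Rank root causes by total investigation/cycle time lost.
hours_by_cause = Counter()
for event in oos_log:
    hours_by_cause[event["root_cause"]] += event["hours_lost"]

print("Most frequent causes:", by_frequency.most_common())
print("Most costly causes (hours):", hours_by_cause.most_common())
```

Either ranking gives supervisors a short list of likely culprits to check before launching a full investigation.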

Bring in automation
Research published by The Royal Society of Chemistry analyzing the most common errors in chemical laboratories uncovered the greatest threat to QC cycle time stability: humans. The study found problems like faulty sample preparation, uncalibrated equipment, miscalculation, and general human error made up the majority of OOS incidents. While insightful, these findings should come as no surprise to manufacturers, especially those who witnessed the age of manual production give way to automation.


Truth be told, manual processes anywhere in the production cycle open businesses up to risk, perhaps even unnecessarily. The burgeoning field of rapid microbiological methods devotes itself entirely to finding a solution to this very issue. Manufacturers should likewise devote their time to investigating and investing in innovations that target low-value, high-risk laboratory activities like data keying or slide movement between processing stations and incubators. Focusing on these areas mitigates the risk of production downtime due to contamination, frees up microbiologists for more value-added opportunities, and reduces the overall time spent performing these tasks, all supporting better cycle times for the rest of the plant.

Go digital for smarter oversight
There’s a reason why many QC labs have gone digital with laboratory information management systems (LIMS). A LIMS aggregates and organizes all QC processing data stored therein, so laboratory workers can utilize information in ways that complement faster, more consistent cycle times. Dashboards and other visualizations immediately come to mind. When technicians can easily interpret their workloads and capacity demands at a moment’s notice, they spend more time applying their talent to testing.

Manufacturers should remember to align their investment strategies with the cycle time improvement initiatives established above. For instance, if a QC lab still finds value in manually keying data directly into a LIMS, it should look for software with configurable, range-checked value fields. A single misplaced decimal point could send a laboratory on a costly wild goose chase attempting to find the phantom catalyst that caused an OOS reading. Some LIMS software can prevent technicians from entering numbers or symbols outside prearranged value ranges, so an error in the QC lab doesn’t carry over onto the production floor in the form of downtime.
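To make that concrete, here is a minimal sketch of range-checked entry. The field names and acceptance ranges are illustrative assumptions, not taken from any particular LIMS product:

```python
# Hypothetical acceptance ranges for manually keyed results; a real LIMS
# would store these per test method and specification.
VALUE_RANGES = {
    "pH": (6.5, 7.5),
    "assay_pct": (98.0, 102.0),
    "bioburden_cfu": (0, 10),
}

def validate_entry(field: str, raw_value: str) -> float:
    """Reject non-numeric input and values outside the prearranged range."""
    try:
        value = float(raw_value)
    except ValueError:
        raise ValueError(f"{field}: '{raw_value}' is not a number")
    low, high = VALUE_RANGES[field]
    if not low <= value <= high:
        raise ValueError(f"{field}: {value} is outside the expected range {low}-{high}")
    return value

# A misplaced decimal point is caught at entry instead of triggering an
# OOS investigation downstream.
print(validate_entry("assay_pct", "99.8"))   # accepted
try:
    validate_entry("assay_pct", "998")       # rejected at the keyboard
except ValueError as err:
    print("Entry blocked:", err)
```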


Enterprise spend management stresses the value of smart budgeting with a tight focus on maximizing ROI, optimizing processes, and expanding businesses in meaningful and sustainable ways. One might think the chemical processing and refining industry wouldn’t need to worry much about how it spends, given that its products permeate as much as 96 percent of the supply chains for all manufactured goods, according to a recent study.

However, the success of the chemical industry as a whole is built on a foundation comprised of companies of all shapes and sizes looking to invest practically in their own prosperity. For spend managers at these organizations hoping to push their enterprise into the 21st century, what value could be achieved by concentrating spend where it’s needed most?


Reliability-centered maintenance
Direct spend management strategies tend to take top billing over indirect spend management. Direct spend management centers on equipment purchases, software integration, and other physical assets a company can buy. Indirect spend management, like coordinating spend on a sound on-site maintenance program, may not be as viscerally appealing as state-of-the-art assets, but it can be just as valuable to a chemical processing plant, if not more so.

Reliability-centered maintenance strategies utilize vast stores of data to analyze how plant equipment functions. In doing so, supervisors can track productivity and spot deficiencies that may evolve into failures over time. Maintenance personnel then address these issues before they worsen. From a spend management perspective, this practice undeniably adds value. Proactive maintenance hits on several crucial concepts the chemical processing industry has been moving to reinforce: environmental accountability, stronger employee safety, secure asset availability and uptime, and overall process optimization.
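As an illustrative sketch of the data side of that practice, a plant might trend a condition indicator and estimate how long until it reaches an alarm limit. The readings, limit, and trending method below are assumptions for illustration, not drawn from any specific RCM tool:

```python
# Illustrative weekly vibration readings (mm/s) for one pump; a real
# program would pull these from a condition-monitoring historian.
readings = [2.1, 2.2, 2.1, 2.4, 2.6, 2.9, 3.3, 3.8]
ALARM_LIMIT = 4.5  # hypothetical vendor alarm limit

# Simple linear trend: average change per week over the last few readings.
window = readings[-4:]
weekly_drift = (window[-1] - window[0]) / (len(window) - 1)

if weekly_drift > 0:
    weeks_to_limit = (ALARM_LIMIT - readings[-1]) / weekly_drift
    print(f"Rising trend: about {weeks_to_limit:.1f} weeks until the alarm limit; schedule maintenance.")
else:
    print("No rising trend detected; continue routine monitoring.")
```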


Furthermore, an updated maintenance plan could guide organizations toward making more intelligent direct spend management decisions. If one asset continues to underperform despite several rounds of proactive maintenance, its owners can use maintenance spend as a metric to gauge the financial viability of a replacement.
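A back-of-the-envelope version of that comparison might look like the following. Every figure is hypothetical and would come from the plant's own maintenance and downtime records:

```python
# Hypothetical figures for one underperforming asset.
annual_maintenance_spend = 85_000      # rising proactive-maintenance cost
annual_downtime_cost = 40_000          # production lost despite maintenance
replacement_cost = 450_000             # installed cost of a new unit
new_unit_annual_maintenance = 25_000   # expected maintenance on the replacement

annual_savings = (annual_maintenance_spend + annual_downtime_cost) - new_unit_annual_maintenance
payback_years = replacement_cost / annual_savings

print(f"Estimated payback: {payback_years:.1f} years")
if payback_years <= 5:
    print("Maintenance spend suggests replacement is financially viable.")
else:
    print("Keep maintaining; replacement does not yet pay for itself.")
```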

Driving out commoditization with R&D investment
Growing commoditization of chemical goods increasingly prevents businesses in the industry from investing in innovation.

In his book Winning at New Products: Creating Value Through Innovation, author Robert G. Cooper explains that although the average time-to-market across all manufacturing sectors has trended downward since the turn of the century by more than 42 percent, new product sales decreased over that same period of time by 15 percent. Many investors cannot justify the “risks” of R&D, like steep upfront costs and long lead time, when highly specialized chemicals already in market favor perform so well.

Desire for a quick buck has overridden the urge to innovate, but from a spend management perspective, this narrow mindset offers little in the way of long-term financial sustainability. The chemical industry shouldn’t be afraid to expand its horizons by investing sensibly in a diverse product portfolio, but it must do so with tact. For instance, chemical companies should only make room in the budget for prospective products that fulfill an explicit objective valuable to their enterprise. In addition, technological investments should serve more than a single purpose, lest they go to waste on the off chance a new product fails to capture the market’s attention.

Manufacturing chemicals for use in other industries could rightfully be described as the backbone of U.S. gross domestic product. According to an American Chemistry Council report, 96 percent of all manufactured goods can be traced back to chemical manufacturers in one way or another. Such a ubiquitous industry comes with its fair share of responsibilities, as well as duties that can clash with companies trying to optimize processes for enhanced production, streamlined operations, and reduced costs. Because chemicals sit so high up the supply chain, any efficiencies gained in these early stages have the potential to trickle down and empower other business associates.

But lean chemical manufacturing and meeting the needs of the manufacturing industry at large don’t have to be mutually exclusive. Chemical manufacturing can support all the industries dependent on its services while still staying accountable to regulations and burgeoning industry trends.


Environmental and operational sustainability through end-of-pipe treatment development
Applying lean best practices to eliminate operational waste can synchronize with end-of-pipe (EOP) waste reduction strategies. When a chemical manufacturer produces less waste as a byproduct of raw material creation, it doesn’t necessarily indicate a drop in production; the change could just as easily show that the manufacturer has addressed deficiencies in its own processes which, when corrected, yield minimal waste. For example, companies that use the chemicals these manufacturers process won’t want a massive inventory cluttering their storage facilities, slowly expiring as the products go unused. Creating a balanced framework for delivery reduces a chemical manufacturer’s chances of overproducing and generating untenable amounts of waste for little discernible reason.

Another aspect of waste management in chemical processing is EOP treatment protocol, which has in the past fallen by the wayside because of its perceived inability to add value to chemical manufacturing. As managerial mindsets swung away from environmental sustainability toward fiscal solvency, EOP treatment became ancillary to production, an afterthought. However, with public opinion pushing chemical manufacturing’s focus back toward greener sensibilities, manufacturers have had to carve out clear justification for these important processing steps.

The truth is, enhancing EOP treatment has considerable benefits for businesses looking to improve on-site operations. According to a study from Malardalen University in Sweden, depending on the treatment processes already in place and planned for the future, chemical manufacturers can utilize technology capable of reclaiming waste materials for secondary use, reducing environmental waste and recouping material costs. Moreover, strengthened EOP treatment can act as insurance against downtime. As federal regulators crack down on chemical manufacturing to curb pollution rates, manufacturers who spend extra resources on waste treatment equipment may avoid costly shutdowns associated with a lax regard for the environment and safety.


Heightened efficiency through multifaceted data management
Chemical manufacturing already permeates nearly every area of industry. That said, there’s no ceiling on the potential for chemical manufacturing to continue effecting change on the larger American economic landscape. After all, we are talking about a more than $800 billion industry – so says the ACC – that not only supports almost every U.S. retailer, but also contributes greatly to the expansion and modernization of American logistics and infrastructure.

With all these moving parts and fingers in proverbial pies, the potential for an individual manufacturer to compromise its own operations is quite high. For starters, since chemical manufacturing precedes the raw materials stage of many other manufacturers’ supply chains, these companies have the power to understand market volatility in areas like feedstock and energy better than their partners. Unfortunately, this can put undue stress on those same chemical companies to provide up-to-the-second price reporting for their own benefit, let alone for others down their supply chains. With that in mind, discovering ways to integrate those financial data streams seamlessly into manufacturing processes also spurs valuable innovation in chemistry, essentially expanding scientific knowledge as a whole. The stakes are great, but the rewards are many.



Proactive data management for greater operational efficiency doesn’t end with mere price reporting; it extends onto the plant floor. Recent research stated that production variability for process chemicals experiences “extreme swings” greater than in many other manufacturing fields, and that granular data analysis is critical for seeking out and identifying operational shortcomings. As an example, the researchers followed a single unnamed chemical company as it attempted to deploy advanced “neural-network” data analytic strategies to study the impact of an array of environmental conditions – like temperature and coolant pressure – on yield sensitivity. After a thorough assessment, the chemical manufacturer was able to reduce its raw materials waste and energy consumption by 20 and 15 percent, respectively. To follow this one company’s example, chemical manufacturers will need to invest in the right kinds of equipment capable of deriving, auditing, and analyzing pertinent data to reach optimal efficiency.
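The report doesn't describe that company's actual model, but a heavily simplified sketch of the general approach might look like the following, using synthetic data and scikit-learn's MLPRegressor as a stand-in for the "neural-network" analytics mentioned above. The process variables, ranges, and yield relationship are all fabricated for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Synthetic, illustrative data: yield as a noisy function of reactor
# temperature (deg C) and coolant pressure (bar). A real study would use
# historian data covering many more process variables.
rng = np.random.default_rng(0)
temperature = rng.uniform(150, 200, 500)
coolant_pressure = rng.uniform(2.0, 6.0, 500)
yield_pct = (90 - 0.02 * (temperature - 175) ** 2
             + 1.5 * coolant_pressure + rng.normal(0, 0.5, 500))

X = np.column_stack([temperature, coolant_pressure])
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
model.fit(X, yield_pct)

# Probe yield sensitivity: sweep temperature at a fixed coolant pressure.
sweep = np.column_stack([np.linspace(150, 200, 6), np.full(6, 4.0)])
for temp, pred in zip(sweep[:, 0], model.predict(sweep)):
    print(f"T={temp:5.1f} C -> predicted yield {pred:5.1f}%")
```

A sweep like this is one simple way to see which operating conditions the model associates with the best yield before committing to process changes.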

Chemical processing and other batch manufacturing operations are becoming increasingly complex. Manufacturers today need to have a fast response time when quality issues arise and when equipment requires maintenance. Combining real-time analysis with reliability-centered maintenance is a great way to raise efficiency in chemical processing. A constant stream of data coupled with a proactive maintenance plan can ensure that production quality remains high and that plant assets operate in prime condition.

Real-Time Analysis Improves Product Quality
Chemical Processing explained that in batch manufacturing operations, identifying product quality deviations is not an easy thing to do. Additionally, if production continues and quality issues are not spotted, the cost of wasted time, materials, and energy adds up quickly. This is why it is beneficial to use predictive control techniques so that product quality remains consistent.

According to Chemical Processing, product quality checks should be done with high frequency and accuracy. Automated feedback control can help manufacturers keep track of operations. However, measurements need to be taken without interrupting processes or increasing risk of contamination. Off-line, at-line, and on-line automated devices can allow measurements to be taken close to the process without slowing down the pace of production. Manufacturers can leverage these technology solutions to minimize or even eliminate time delays, if they are able to use the steady stream of accurate and reliable information.
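The article points to predictive control; as a much simpler stand-in, the sketch below shows a bare-bones proportional feedback loop that nudges a hypothetical setpoint toward a target quality reading as on-line measurements arrive. The target, gain, setpoint, and readings are all illustrative assumptions:

```python
# Bare-bones proportional feedback: correct a process setpoint based on
# near-real-time quality measurements instead of waiting for a batch to
# finish off-spec. All numbers are illustrative.
TARGET_PURITY = 99.0   # percent, hypothetical quality target
GAIN = 0.5             # correction applied per percentage point of error

setpoint = 120.0       # hypothetical reactor temperature setpoint (deg C)
online_readings = [98.2, 98.5, 98.9, 99.1, 99.0]   # analyzer purity values

for purity in online_readings:
    error = TARGET_PURITY - purity
    setpoint += GAIN * error   # small, immediate correction each cycle
    print(f"purity={purity:.1f}%  error={error:+.1f}  new setpoint={setpoint:.1f}")
```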


Analyzers provide manufacturers with measurements in real time or near-real time, but manufacturers must be able to respond quickly. For that to happen, KPIs from the analyzers should be worked into the management operating system so that plant managers know how to respond to spikes or dips in the numbers. Data is only useful if companies know how to use it. This is why manufacturing industry experts pride themselves on having a hands-on approach; giving consulting advice from the boardroom is not always helpful.
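One minimal way to surface those spikes and dips is to compare each new analyzer reading against a rolling mean. The window size, threshold, and readings below are assumptions for illustration only:

```python
from collections import deque
from statistics import mean, stdev

# Flag spikes or dips in an analyzer KPI by comparing each new reading
# against a rolling mean +/- 3 sigma.
window = deque(maxlen=20)

def check_reading(value: float) -> str:
    if len(window) >= 5:
        mu, sigma = mean(window), stdev(window)
        if sigma > 0 and abs(value - mu) > 3 * sigma:
            window.append(value)
            return f"ALERT: {value:.2f} deviates more than 3 sigma from rolling mean {mu:.2f}"
    window.append(value)
    return f"OK: {value:.2f}"

for reading in [50.1, 50.3, 49.9, 50.2, 50.0, 50.1, 56.8, 50.2]:
    print(check_reading(reading))
```

An alert like this only creates value if the management operating system defines who responds to it and how, which is the point the paragraph above makes.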

Use technology to provide real-time analysis to monitor quality

Data Helps Manufacturers Stay Competitive
Chem.info explained that big data is changing the chemical industry in a profound way. Companies can use real-time analysis of data from production centers to increase margins, improve product quality, and shape business processes. Once manufacturers put the systems in place that collect the data, they should then identify the best ways to use it to reduce costs, manage assets, generate revenue, and make smarter decisions going forward. Plant managers who can effectively interpret the data and respond accordingly will see noticeable improvements in their business. For example, data analysis can help manufacturers make variants of existing products and either lower the production cost per unit or produce a higher-quality substitute with higher profit margins.

One practical way to use data from real-time analysis is to link it with reliability-centered maintenance. Demonstrating vigilance in monitoring quality and production levels, as well as keeping assets in optimal condition, can lower overheads and raise efficiency. As previously mentioned, the chemical manufacturing process is complex, but properly leveraging data makes things simpler, allowing companies to better manage production assets and throughput processes simultaneously.

It is important to mention, however, that smart technology alone will not provide the desired benefits. Data collection and analysis are only a means by which better decisions and actions are taken. Companies that understand this last point know that educating staff on how to interpret and proactively respond to KPIs is just as important as investing in new technology. As the manufacturing field continues to be influenced by evolving technology, management skills must evolve as well.

With worldwide prices for many commodities like precious metals, chemicals, and even food products continuing to fluctuate, many industries are struggling to match their production with market demand. In addition to the havoc this causes with labor requirements and planning, it has a huge impact on the availability of major production-related assets.

Advances in technology continue to change industrial processes, reducing the amount of work done by a skilled labor force. Given that, a strong Asset Maintenance Program is a necessity.

Here’s why:

1) Major capital-intensive assets like earth-moving equipment, cranes, tankers, and drill rigs have become essential to boosting production, so the need to leverage these fixed assets has never been more critical. Consider that the value generated from increased asset utilization has a multiplier effect: just a 1% improvement can have an outsized impact on the bottom line (see the worked sketch after this list).

2) While the above point may seem obvious, the caveat is that it is often not fully realized. Why? Because while these assets may be necessary, they have become harder to maintain and operate, which has eroded the value of the production increases they generate. As a result, preventive asset maintenance with a reliability-centered maintenance (RCM) approach is critical to keeping these assets performing at or near their optimal level.

This approach applies not just to movable equipment, but to entire facilities like power generation plants, chemical and oil & gas refineries, and manufacturing plants. The turnaround time for shutdowns and planned outages can be greatly reduced with a strong Asset Maintenance plan.

3) Better leveraging fixed assets by keeping operating costs low should be a goal for any company involved in production, processing, or any heavy industry. Revenue-generating assets can represent billions in capital investment. Appropriate preventive maintenance requires a robust Asset Maintenance program and will keep your assets well maintained and capable of meeting present and future production demands.
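To see why a small utilization gain punches above its weight, here is the back-of-the-envelope sketch referenced in point 1. The figures are purely hypothetical; the point is that fixed costs stay flat, so the extra output flows through at close to full contribution margin:

```python
# Why a 1% utilization gain has an outsized bottom-line effect:
# fixed asset costs are already sunk, so incremental throughput earns
# nearly full contribution margin. All figures are hypothetical.
annual_revenue = 200_000_000      # revenue at current utilization
fixed_costs = 120_000_000         # depreciation, maintenance, overhead
variable_cost_ratio = 0.30        # variable cost per revenue dollar

baseline_profit = annual_revenue * (1 - variable_cost_ratio) - fixed_costs

utilization_gain = 0.01           # 1% more output from the same assets
extra_revenue = annual_revenue * utilization_gain
extra_profit = extra_revenue * (1 - variable_cost_ratio)   # fixed costs unchanged

print(f"Baseline profit: ${baseline_profit:,.0f}")
print(f"Profit lift from a 1% utilization gain: ${extra_profit:,.0f} "
      f"({extra_profit / baseline_profit:.1%} of baseline profit)")
```

Under these assumed figures, a 1% utilization gain lifts profit by roughly 7%, which is the multiplier effect described above.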

Maximizing the utilization of revenue-producing heavy assets is a key driver of financial performance!