Tag Archives: Continuous Processing


In life, bottlenecks crop up from time to time: moments when we’re stuck and can’t push through a problem, and patience may be the only remedy. In the manufacturing sector, however, bottlenecks can be a serious drain on productivity, revenue, efficiency, and asset utilization.

What does a bottleneck mean in manufacturing terms?
When people think casually of “bottlenecks,” they might think of forced congestion, like traffic on a multi-lane highway pinched down to a single lane for construction or emergency reasons. Traffic would be an apt metaphor, as bottlenecks on the road and those in a manufacturing plant are both concerned with throughput and achieving continuous flow.

However, in manufacturing, bottlenecks have a clear-cut definition as well. According to the Institute of Industrial and Systems Engineers, a bottleneck occurs when the demand placed on a resource exceeds that resource’s capacity, constraining the throughput of the entire process.

To identify bottlenecks, plant managers should use operational data and focus on steps in assembly that meet this definition or come close. That said, seasoned plant employees and equipment operators may be able to sense and point out potential bottlenecks without crunching the numbers.
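For managers who do want to crunch the numbers, the data work can be quite simple. Below is a minimal, hypothetical sketch (station names and hourly rates are invented for illustration): the bottleneck is the step with the lowest effective throughput, which caps the rate of the entire line.

```python
# Hypothetical example: find the line's bottleneck from operational data.
# Station names and hourly rates below are invented for illustration.
station_throughput = {
    "filler": 1200,   # units per hour
    "capper": 1150,
    "labeler": 640,
    "packer": 1100,
}

def find_bottleneck(throughput):
    """Return (station, rate) for the slowest step, which caps the whole line."""
    station = min(throughput, key=throughput.get)
    return station, throughput[station]

station, rate = find_bottleneck(station_throughput)
print(f"Bottleneck: {station} at {rate} units/hour")
```

In this toy data, the labeler’s 640 units per hour limits the whole line, matching the bottling example below.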

What do bottlenecks look like on the plant floor?
For some manufacturing processes, bottlenecks are easy to spot. For instance, if asset operators in a bottling facility – to continue with today’s theme – notice goods visibly accumulating at a labeler, chances are the labeler is the issue, so long as all other operations appear to be running at capacity.

Manufacturing Bottlenecks
“One common indicator of a bottleneck is inventory overabundance.”

Other times, bottlenecks are not as readily apparent, but misbehaving processes ancillary to production may clue plant managers into trouble elsewhere. One common indicator of a bottleneck is inventory overabundance. When manufacturers have properly aligned inventory against demand, inventories should remain relatively lean. Materials overflow in that context, therefore, signals capacity issues somewhere in production. All that’s left to do is hunt them down and sort them out.

How can manufacturers overcome bottlenecks?
Once plant managers locate their bottleneck, they must take three basic steps to formulate a solution. First, they must temporarily reduce the capacity of the entire process, carefully watching the suspected bottleneck for any observable performance problems.

Second, upon gathering a few notions as to what may be creating the bottleneck, plant managers must conduct root cause analysis on each. Root cause analysis involves tracing the conspicuous bottleneck issue back to its true catalyst. Perhaps it’s as simple as a mechanical failure, or as systemic as accidental overproduction.

Finally, after arriving at the bottleneck’s root cause, plant managers should assess whether it is a long- or short-term concern. Long-term concerns may require organizational change to correct the issue once and for all, as well as scheduled maintenance or production downtime to finalize the fix.

Short-term concerns generally correct themselves with little to no intervention. However, in the meantime, plant managers may feel more secure in their operations by decentralizing capacity over multiple employees or machinery. That way, should an out-and-out failure occur, manufacturers minimize the effect of downtime on productivity.

Trade advisory consultants at Vigilant Global Trade Services, a turn-key solutions provider for every aspect of global trade management, created a detailed infographic illustrating how to avoid bottlenecks in your supply chain:


Vigilant Global Trade Services infographic illustrating how to avoid manufacturing bottlenecks in your supply chain.


Contact us today to learn more about how you can avoid manufacturing bottlenecks and optimize your operations.



The U.S. chemicals manufacturing space is poised for growth following years of middling performance. Worldwide economic development, improvements in the domestic manufacturing and oil and gas industries, and the completion of improved chemical production infrastructure are likely to drive historic gains over the next two years, researchers from the American Chemistry Council found. Production volumes are expected to increase 3.7 percent in 2018 and 3.9 percent in 2019, laying the groundwork for an industry valued at more than $1 trillion by 2022. How has the sector managed to regain ground in the marketplace? Increased production capacity linked to digitization.

Advanced hardware and software are transforming chemicals producers of all sizes, facilitating efficiency gains across virtually all operational areas, from the back office to the shop floor, according to the World Economic Forum. By 2025, digital technology will have generated a cumulative economic value of between $310 billion and $550 billion within the worldwide chemicals space. There are, of course, countless solutions and deployment methods specially designed for use within the chemicals manufacturing arena. However, innovations centered on asset management stand above the rest in terms of demonstrable operational impact.


Unpacking the asset management equation

Chemical companies live and die by the mission-critical machinery they use to craft their product. In the event of unexpected downtime, the entire operation grinds to a halt, customer orders go unfilled, and revenue drops. Losses accelerate quickly when this occurs. For example, the average automotive manufacturer loses an estimated $22,000 per minute of unplanned production stoppage, according to research from Advanced Technology Services and Nielsen. Only the largest companies can weather such losses. Small or midsize organizations might falter entirely under the weight of such astronomical downtime costs.

Chemicals manufacturers are at great risk of suffering such events due to the very nature of their work. These firms supply the market with more than 100,000 different chemical compounds, according to the WEF. The vast majority of these substances are extremely caustic and therefore wreak havoc on production assets, requiring major investments in maintenance. In the U.S., chemicals producers are expected to spend more than $1.26 billion on planned maintenance activities in 2018, constituting a year-over-year rise of more than 38 percent, analysts for the ACC found. Of course, this figure does not take into account unplanned work, which usually costs 2 to 5 times more than scheduled activities, according to the Marshall Institute.
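The arithmetic behind these figures is straightforward. A hedged sketch using only the numbers cited above (the $22,000-per-minute automotive downtime estimate and the Marshall Institute’s 2-to-5x rule of thumb; the 30-minute stoppage and $100,000 planned job are invented examples):

```python
# Illustrative cost arithmetic using figures cited in the text:
# ~$22,000 per minute of unplanned stoppage (ATS/Nielsen, automotive average)
# and unplanned work costing 2 to 5 times planned work (Marshall Institute).

def downtime_loss(minutes, cost_per_minute=22_000):
    """Estimated revenue lost to an unplanned production stoppage."""
    return minutes * cost_per_minute

def unplanned_cost_range(planned_cost, low=2, high=5):
    """Likely cost band if a planned job instead happens as unplanned work."""
    return planned_cost * low, planned_cost * high

loss = downtime_loss(30)                # a single 30-minute stoppage: $660,000
band = unplanned_cost_range(100_000)    # a $100,000 planned job: $200k-$500k
```

Even a half-hour stoppage at the cited rate dwarfs the cost of most preventive work, which is the economic case for the predictive approaches discussed next.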


Implementing an innovative solution

Rising costs and the continual existence of massive maintenance-related risk have forced businesses in the chemicals manufacturing arena to embrace bleeding-edge technology in hopes of streamlining asset management workflows and ultimately improving reliability. Many are turning toward predictive maintenance processes powered by connected equipment sensors and robust backend platforms, Schneider Electric reported. These all-encompassing solutions allow chemical companies of all sizes to closely monitor their production assets and catch small mechanical issues before they devolve into full-on catastrophes with the potential to cause downtime. Such products also give producers the power to continually fine-tune their machinery, embrace continuous improvement, and boost productivity.

Chemicals manufacturing

Early adopters have seen serious results, promoting wider investment in the technologies that underpin such proactive asset management approaches. For instance, businesses across all sectors are expected to spend more than $239 billion on industrial sensor technology alone in 2018, researchers for the International Data Corporation have predicted. Firms in the chemicals manufacturing space are likely to contribute to this spend as they retrofit their production workflows to more effectively compete in an expanding marketplace. However, such technology is unlikely to remain optional for long. Almost 90 percent of chemical company executives believe businesses in the industry that fail to embrace digitization will end up falling behind, the WEF found.

Chemicals manufacturing firms standing on the outside looking in on this trend must act quickly to implement next-generation asset management processes and technology. USC Consulting Group can help. Here at USCCG, we’ve been working with businesses across numerous industries for 50 years, helping them adjust to marketplace transformations of all kinds. Connect with us today to learn more about our work and how our chemicals manufacturing consultants can help your enterprise embrace and benefit from digitization.

Let’s keep in touch – subscribe to our blog in the top right of this page or follow us on LinkedIn and Facebook.



Overall equipment effectiveness is an essential key performance indicator for modern manufacturers.

For a multi-plant manufacturer, OEE provides an opportunity for internal benchmarking of production processes. At the plant level, it serves as a guide for where to focus resources to continuously improve and lower costs.

An ever-changing marketplace further reinforces the need for reliable production equipment, as today’s producers must cultivate agile yet dependable operations to survive. With these challenges in mind, manufacturing stakeholders often place great importance on OEE measures and encourage their teams to work as hard as they can to improve such metrics. However, the reliance on OEE has generated industrywide misconceptions surrounding the KPI, leading many manufacturers to operate with skewed views of the venerated performance standard.

Here are three of the most widely circulated myths about OEE:

1. ‘Elevated OEE figures are everything.’
Most modern operational stakeholders are conditioned to believe that high KPI readings signify success. This perception is based in reality, but some production leaders focus solely on the magnitude of the figure without considering it in the context of an entire workflow, according to OEE expert Arno Koch. For example, an organization might implement effective maintenance and operational protocols that produce an OEE of 95 percent, a seemingly excellent mark. Production roadblocks arise, however, if downstream processes are not ready to receive product from an asset functioning at such a high OEE.

Manufacturers must remember to contextualize OEE and strive for numbers that lay the groundwork for smooth operations. What might those numbers be? Koch said world-class workflows are buoyed by machines running between 35 and 45 percent OEE. In short, manufacturing firms need not shoot for the stars.
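For reference, OEE itself is the product of three ratios: availability, performance, and quality. A minimal sketch of the standard formula (the 90/95/99 inputs are illustrative, not drawn from Koch):

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Standard OEE formula: the product of three ratios between 0 and 1.

    availability = run time / planned production time
    performance  = actual output / theoretical maximum output
    quality      = good units / total units produced
    """
    return availability * performance * quality

# Illustrative inputs: 90% uptime, 95% of ideal speed, 99% good parts
score = oee(0.90, 0.95, 0.99)   # ~0.846, i.e. roughly 85 percent OEE
```

Because the three factors multiply, a weakness in any one of them drags the whole score down, which is why OEE must always be read against its components rather than as a single number.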

2. ‘Firms unconcerned with raising output should stop focusing on OEE.’
On the surface, this reasoning appears to make complete sense. Why allocate considerable manpower, resources, and time to elevating OEE when the production ceiling has been reached? Because increased output is not the only benefit that accompanies improved equipment effectiveness. When production machinery operates more efficiently, production times drop, along with resource usage and maintenance demands. In the end, this leads to lower costs. And in the event that production needs to scale upward because of increased market demand or expansion, the existing machinery is ready to support such growth without a need to invest additional capital in new equipment.

Operational stakeholders should take this into account when considering OEE improvement efforts. Bolstering machine effectiveness is about far more than output.

3. ‘OEE improvement necessitates considerable CAPEX investment.’
Modern manufacturing leaders, most of whom manage firms with tight budgets, are often reluctant to embrace large capital expenditures. For this reason, many manufacturers simply move forward without addressing OEE, as they believe such efforts will come with significant costs. And sure, some organizations do spend a lot on equipment upgrades and other improvements meant to boost the effectiveness of their machines. However, this is not the only way.

Forward-thinking organizations looking to improve OEE often adopt lean principles in lieu of expensive mechanical upgrades. These workflows, which are popular among some of the most successful manufacturing companies in the world, help production teams pare down their workflows and implement continuous improvement efforts, both of which lay the groundwork for increases in OEE. Such firms also embrace digitization, installing cutting-edge back-end systems and equipment sensors that continually track machine performance and offer the data-based insights needed to make meaningful improvements.

Instead of committing to large-scale equipment investments, businesses in the manufacturing space can take more scaled-down approaches and improve the processes surrounding mission-critical production assets to drive higher OEE.

Is your organization interested in moving past these and other OEE myths, and implementing higher-performing production processes? Connect with the experts at USC Consulting Group. For 50 years, our operations management consultants have helped manufacturers grow their businesses and expand their footprints in the marketplace.

Contact us today to learn more about our manufacturing service offerings and expertise.


No industrial business that hopes to turn a profit in the era of Industry 4.0 can do so without selecting the right key performance indicators (KPIs). These metrics afford operations managers and senior-level decision-makers critical snapshots of how their plants operate and, more importantly, how they ought to operate in order to compete.

When it comes to Oil and Gas, the KPIs that processing facilities choose and utilize depend on their unique objectives. Even businesses within the same industry may seek vastly different insights from their data. But given the state of O&G in America today, which KPIs often make the cut?

1. Capital project efficiency
Between 2006 and 2013, capital expenditure budgets for exploration and upstream oil production grew at many E&P firms, but those investments have since failed to deliver returns commensurate with the allocation, according to a report from PricewaterhouseCoopers. Capex concerns have no doubt been exacerbated by steep decreases in per-barrel oil prices, both then and since.

As the Oil and Gas industry as a whole moves into more challenging drilling and extraction environments, it should consider how to clearly articulate capex project efficiency: a multifaceted measurement that combines adherence to budgetary allowances, maintenance spend, project overruns, and scope creep.

Oil & Gas KPI

2. Attendance and completion of safety training
Safety is notoriously difficult to track, but this particular KPI can give on-site safety managers a peek at potential dangers to come. Oil and Gas needs it now more than ever – according to an E&E News analysis of data from the Occupational Safety and Health Administration, severe work-related injuries pertaining to “support activities for oil and gas operations” occurred at a rate of nearly 149 per 100,000 workers. Severe injuries include amputation, in-patient hospitalization, and loss of an eye.

It stands to reason that those who complete safety training courses will act safely and avoid injury. Safety managers must therefore address any barriers suppressing attendance or completion, and KPIs tracking both will alert them to such issues, especially if they monitor completion rates over time. Advocacy for training will ensure investment in these programs pays off as soon as possible, a crucial factor as many O&G companies face tight margins.
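Tracking that rate over time needs little more than a running ratio. A hypothetical sketch (the periods and headcounts are invented for illustration): a sustained decline is the signal that barriers to attendance or completion deserve attention.

```python
# Hypothetical safety-training KPI: completion rate per period.
# Enrollment and completion counts below are invented for illustration.
periods = [
    ("2018-Q1", 40, 32),   # (period, enrolled, completed)
    ("2018-Q2", 44, 33),
    ("2018-Q3", 45, 27),
]

def completion_rate(enrolled, completed):
    """Share of enrolled workers who finished the course in a period."""
    return completed / enrolled if enrolled else 0.0

rates = [completion_rate(enrolled, completed) for _, enrolled, completed in periods]
declining = all(later < earlier for earlier, later in zip(rates, rates[1:]))
# A sustained decline flags barriers suppressing attendance or completion
```
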

3. Leaks per X customers
Midstream oil and gas firms oversee an incredibly long and intricate distribution network, which requires an understanding of its environmental impact. U.S. Oil and Gas companies manage 2.4 million miles of energy-related pipe and 72,000 miles of crude-oil pipe, according to Pipeline101.org, a website maintained by the Association of Oil Pipe Lines and the American Petroleum Institute.

Aligning maintenance spend with environmental regulations requires a deeper look into how often leaks occur, in relation to customer revenue, and which assets incur the highest repair costs.
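The KPI itself is a simple normalization: expressing leaks per 1,000 customers puts network segments of very different sizes on one comparable scale. A hedged sketch with invented figures:

```python
# Hypothetical normalization: leaks per 1,000 customers.
# Leak counts and customer totals below are invented for illustration.
def leaks_per_x(leaks: int, customers: int, x: int = 1_000) -> float:
    """Normalize a raw leak count by customer base so segments compare fairly."""
    return leaks / customers * x

segment_a = leaks_per_x(12, 80_000)   # 0.15 leaks per 1,000 customers
segment_b = leaks_per_x(4, 15_000)    # fewer leaks in absolute terms, worse rate
```

Note how the smaller segment posts a worse rate despite fewer raw leaks, which is exactly the distortion the normalization is meant to expose.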

Although these KPIs have driven success for others, their real value lies in how facilities implement them through continuous improvement initiatives. To speak to organizational management consultants on how to turn KPI insights into actions, contact USC Consulting Group today.



Continuous Improvement (CI) for process industries can feel like trying to fix plumbing with the water turned on. No matter how carefully you approach the problem, messes will occur.

At petrochemical and bulk chemical plants where operations run around the clock, any downtime subtracts from potential revenue. The only recourse is minimizing losses and improving operations in ways that deliver lasting, timely change while upholding today’s safety and quality standards.

That’s where CI makes its case.

With that said, process improvement researchers at technology firm Quandary Consulting Group have found that CI initiatives, while beneficial, often face barriers that sink success. Where have CI implementers in process industries struggled the most when applying this important discipline to their operations?

They didn’t finish what they started

CI, referred to as kaizen in lean manufacturing and Six Sigma circles, does not mean existing in a state of perpetual adjustment, where something is always in progress. Rather, kaizen encourages an openness and willingness to change at all times: a focus on betterment over contentment with the status quo.

Facilities run into trouble when they begin a CI project and fail to fully follow through, instead looking to what lies ahead rather than what’s in their laps. The first thing to go is detailed real-time analysis; then motivation wanes; then the project lingers on unfinished or is wrapped up hastily. Prevention requires three resources: a senior-level leader to take ownership of the project at hand, a quantifiable end point, and a method for articulating gains to project stakeholders to energize involvement.

They never considered those most affected by changes

CI pushes adopting companies toward operational perfection, but as a philosophy, it also challenges the behavior of those directly involved in improvement initiatives, both leaders and technicians.

Change management of any name requires advocacy spurred by honest collaboration. Not every improvement will appear as such to every worker, and indeed some may not benefit them directly; the benefits of change may not readily present themselves. It’s up to project leaders to communicate what operational excellence will be achieved through improvement. Those leaders must also learn to use that communication as a springboard at the start of changes, rather than as a way to solicit involvement in the middle or at the end of a project.

They didn’t recognize when enough was enough

CI projects push limits, but eventually limits push back. Using music as a metaphor, columnist and consultant Alastair Dryburgh illustrated two different ways facilities with continuous improvement initiatives react to those hard limits.

“There’s a very clear standard of what makes a good pianist or violinist,” wrote Dryburgh, “and the difference between the really top people and the next level is very small […] In rock, rap or hip hop, on the other hand, it’s different. Everyone is working in the same sort of genre, roughly, but success comes from establishing a distinctive sound and personality.”

Companies tend to think about continuous improvement too literally, as making something progressively better and better. Eventually, however, they reach a limit where the investment in change far outweighs the slight return. Those organizations must consider the value of innovation as a form of CI. What’s left to determine is whether the limit truly can go no higher, or whether doing something daring in another area will achieve even greater operational performance.

They didn’t leverage the right expertise

Advanced process control systems commonly used in process industries bolster best practices for operational data utilization, but they aren’t everything. Big data insights that encourage CI need capable hands to convert those values into actionable strategy. If leaders at processing plants are without such expertise, they should consider a partnership with operational consultants for large-scale projects or for advice on how to train current operators to support future endeavors. For more information, contact the operations management consultants at USC Consulting Group.

Contact USC Consulting Group


In theory, the continuous improvement model encourages organizations to advance positive changes frequently and smoothly. But what about in practice? How can an industry such as oil and gas, currently wrestling with so much disruptive change, reach a point where practical CI is even possible?

It all starts with culture: the fertile ground in which continuous improvement principles take root as day-to-day practices and work is carried through to completion time and time again.

The roadmap of Continuous Improvement culture-building

There are several aspects to developing CI culture, some of which require creative, as opposed to formulaic, thinking. Think of CI culture-building like preparing for a long journey: identifying where the organization stands today, where it wants to go, and what it will need along the way.

Armed with an overview of what continuous improvement culture-building entails, let’s turn our attention back to Oil & Gas and discuss why rapidly developing an internal environment that supports these methodologies matters.

1. Oil & Gas has room for improvement

Many influential organizations have called on Oil & Gas to acclimate to a world driven by environmental and cultural sustainability. Earlier this year, the United Nations Development Program, along with the International Finance Corporation and IPIECA, published a report detailing how private-sector Oil & Gas companies can integrate 17 sustainable development goals into standard operations around the world.

Additionally, according to the latest information from the Bureau of Labor Statistics, women make up only one-fifth of all U.S. oil and gas extraction jobs, even though women occupy about 47 percent of the entire labor force. Creating environmentally sustainable operations and achieving stronger female representation in Oil & Gas both represent worthwhile justification for starting the continuous improvement cycle sooner rather than later.

All Oil & Gas workers, regardless of role, deserve CI value boosts.

2. Individual value creation is imperative in low-price environment

At its essence, CI trains organizations to target and remove waste ad infinitum, which increases the value of the work each CI-compliant employee performs. It also incentivizes leaders to invest in training, as doing so will maximize their returns in the form of a highly intelligent workforce.

Automation in Oil & Gas behaves in a similar fashion, reducing work that doesn’t add value or actively depletes value. Both continuous improvement and automation are necessary, and can easily play off of each other, as Oil & Gas companies aim to minimize their operating expenses in the long term and adjust to a financially leaner industry climate. However, in many instances, CI is the figurative fuel that powers the engine of cost-saving innovations like automation. Advanced software and technology-driven processes will not succeed without a culture that clearly defines their significance to the organization utilizing them.

3. Forming the right Continuous Improvement team takes time

Any business undertaking CI methodologies must first build a team of core members who will strive for success. That takes a lot of careful planning, scheduling, and even permanent alterations to roles within a company.

CI team members must possess a deep understanding of their industries, market performance and the challenges of their unique businesses – all signs point to the inclusion of senior-level management, as well as perhaps a few executive stakeholders, along with a cadre of rank-and-file workers with highly developed skills and specialized knowledge.

But instead of piling continuous improvement-related duties on top of traditional job specifications, Oil & Gas companies must rewrite all internal roles to account for CI, which will also mean delegating legacy duties once reserved for upper management to new parties down the chain of command. Those are not decisions to enter into lightly, so it behooves businesses to start planning now to implement continuous improvement as soon as possible.

Continuous improvement puts the future of Oil & Gas within reach, but companies must first develop a culture conducive to best practices. From there, augmenting operations and incorporating new elements into the greater business schema will become far easier, and Oil & Gas companies can adapt intelligently to whatever tomorrow brings.

For more information on continuous improvement and operations management in oil and gas, contact a USC Consulting Group representative today.

Running a successful midstream oil and gas company can feel a lot like caring for a two-headed dog, one head for upstream operations and the other for downstream. Midstream can’t address one head’s needs over the other’s, yet balancing both can prove complex and even redundant, leading to more work with less payoff. However, midstream can keep the peace by using the right process optimization strategies, ones focused on increased transparency in both data and resource management.

Technological transparency for ethane recovery/rejection
Midstream oil and gas companies seeking to optimize their processes won’t succeed with a one-size-fits-all fix. Companies must develop multifaceted plans that cover different paths upstream operations might take according to market demand.

The choice between ethane rejection and recovery represents one such decision. Should upstream oil and gas companies divert their ethane into dry natural gas pipelines, or should they simply move it down the chain as is and send it to fractionators for midstream processing? In either case, how can each path receive an optimization boost so midstream businesses turn a good profit without adding complexity to an already complicated system?

“Data visibility allows midstream companies to discern which way upstream operations are leaning.”

Coordination with upstream and downstream partners through data transparency gives midstream companies a window into their partners’ schedules so they can plan their own effectively. Market and demand fluctuations scrutinized by either party should be shared via an easily accessible data management portal. Instead of waiting for commands from producers about the course of action to be taken, midstream companies can use that visibility to discern which way upstream operations are leaning and plan accordingly.

It isn’t just about market fluctuations either, as Level 2 Energy reported – the recovery-versus-rejection decision could also be contingent on the amount of ethane producers have recently sold as natural gas. Again, as the three segments of the oil and gas chain share free-flowing, up-to-the-minute information through a centralized, consolidated data stream, midstream can better anticipate what’s coming down the pipe.

Hydrocarbon transfer asset management and maintenance
Hydrocarbon transfer systems present midstream companies with another valuable opportunity to hone processes, not only in their day-to-day operation and use, but also in their upkeep over time. Lease automatic custody transfer (LACT) units move resources between sellers and buyers and give all parties involved visibility into previously transported volumes.

So why does such important equipment appear to have fallen by the wayside while the rest of the oil and gas industry seems so open to other forms of modernization? A study by Rockwell Automation found that a majority of LACT units rely on “snail mail” to report supply information between seller and buyer, leading to an epidemic of accounting errors. Additionally, maintenance plans for these hydrocarbon transfer assets are usually just as outdated. Manually managing these sites with little to no technological integration and data reporting means the midstream businesses charged with their oversight waste valuable resources visiting remote LACT units that have no problems while neglecting those that do.

“Operators can coordinate LACT unit visitation data to create criticality rankings.”

Midstream companies ought to invest in remote monitoring capabilities to prioritize maintenance work orders and bolster resource accounting in one fell swoop. Operators can coordinate LACT unit visitation data to create criticality rankings for each site in terms of which receives immediate attention and why. Digital resource reporting capabilities will also prevent miscommunication between buyers and sellers, decreasing or eliminating altogether the resources expended to amend breakdowns in accounting.
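One way to build such rankings is a simple weighted score over the monitoring data. A hypothetical sketch (the site IDs, field names, and fault weighting are assumptions for illustration, not a vendor formula):

```python
# Hypothetical criticality ranking for LACT sites from remote-monitoring data.
# Site IDs, field names, and the fault weighting are invented assumptions.
sites = [
    {"id": "LACT-01", "faults_90d": 1, "throughput_bpd": 9_000},
    {"id": "LACT-02", "faults_90d": 6, "throughput_bpd": 5_000},
    {"id": "LACT-03", "faults_90d": 3, "throughput_bpd": 12_000},
]

def criticality(site, fault_weight=1_000):
    """Weigh recent faults against the daily volume at stake if the unit fails."""
    return site["faults_90d"] * fault_weight + site["throughput_bpd"]

ranked = sorted(sites, key=criticality, reverse=True)
visit_order = [site["id"] for site in ranked]   # most critical site first
```

Maintenance crews then work down `visit_order` instead of making blanket rounds, so trouble-free sites naturally drop to the bottom of the schedule.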

Midstream oil and gas operations can serve two masters, so long as companies pay particular attention to how innovation can streamline data flow and simplify their responsibilities in intelligent ways.

Industries from biopharm to oil and gas are abuzz with praise for continuous processing technology and the advantages the model brings to their businesses. Compared with batch production, continuous processes typically take less time, consume less energy, and leave a smaller operational footprint, depending on the industry and assets utilized. From there, many businesses have seen significant Opex reductions, productivity gains, and alternative value-add opportunities.

Hype surrounding continuous processing can be particularly difficult to examine objectively, especially for decision-makers who lack the technical expertise to determine whether certain batch processes under their purview are worthy of, or in need of, an upgrade to continuous status.

Do your operations fit the criteria below? Then it may be time to switch. Or perhaps, given what you learn, you’ll need to develop other areas first before taking the dive into continuous processing, so you can gain and sustain its benefits.

Continuous processes ‘heating up’ in biopharm and chemical processing
If the science matches up, your company could be a prime candidate for continuous processing. Researchers from the Agency for Science, Technology and Research in Singapore published a study demonstrating how exothermic and endothermic liquid-phase reactions occurring in pharmaceutical or chemical processes could prosper greatly from continuous production methods over batch.

A*STAR scientists noted that biopharm companies and chemical producers utilizing the Reformatsky reaction, an “organozinc-catalyzed reaction that frequently overheats with batch processing,” could find value in continuous processing. Using continuous methods in this way, companies could save on labor and resource costs, retain high uptime rates, uphold product quality, and perhaps even leverage efficiency as a means of lowering prices for consumers.

Will continuous processing give you IT nightmares?
A recent Automation World survey conducted for its advertisers inadvertently revealed several crucial differences between business leaders operating continuous processes versus batch processes. In sharing the results, the publication has provided on-the-fence decision-makers with powerful insights into what process changes could mean for their business at large.

The survey found that more readers working with continuous processing worried about "technology upgrades" and "cybersecurity" than those working with batch processes. While correlation does not imply causation, Automation World Director of Content and Editor-in-Chief David Greenfield, who wrote the article accompanying the survey, raised points that on-the-fencers should not take lightly. Companies capitalizing on the technological innovation, connectivity, and interoperability that cutting-edge continuous processing equipment brings are more likely to be keenly aware of the possibility of system breaches. That said, if your organization already struggles with cybersecurity under a batch regime, it may be best to close those gaps before pursuing continuous processing and the technology that makes it possible.

Continuous processing removes many inefficiencies batch producers have struggled with since the dawn of modern industry. However, implementing continuous processes without proper foresight could backfire. If you wish to gain a competitive advantage, research how continuous processing has made an impact in your specific industry before integrating it.

Natural gas production has remained stagnant even as the nation creeps toward cooler weather. In response, processing plants have begun to diversify their product portfolios, investing in asset infrastructure for purifying natural gas liquids (NGLs).

But what does diversification like this mean, especially to an industry focused on cutting Opex costs and optimizing production? What concerns should stay at the forefront of midstream investors’ minds when installing, expanding, or reconfiguring NGL fractionation and distillation equipment?

Plan for market agility through asset utilization
Although the low cost of natural gas may benefit gas-fired energy generators across the country – especially as air conditioning demand trends upward, according to Reuters – companies entrenched in the oil and gas industry must find new ways to capitalize on their goods without saturating the market. Extracting pentanes and other valuable hydrocarbons from NGLs spares natural gas organizations from tapping extra wells and lets them make the most of the production already available to them.

However, as midstream operations separate ethane, butane, and other products from NGLs, the asset expansion needed to handle these varied resources stands to complicate processing and open the door to mechanical failures, product mishandling, and perhaps even regulatory noncompliance. Additionally, a diversified product slate so reliant on domestic and export market performance requires responsiveness to remain a boon to business. When one product outperforms the others, decision-makers must be ready to tilt production accordingly without compromising quality or service.

Maintain cost-effective energy consumption
Industry leaders know the distillation columns used in NGL fractionation burn a great deal of thermal energy, with as much as 40 percent of on-site energy used for "refining and continuous chemical processes," according to the U.S. Department of Energy.

Upstream, labor reductions at extraction sites trim production to avoid market saturation, but these austerity measures also aim to deflate costs across all oil and gas operations while prices remain low. Adding energy-intensive assets without taking energy expenses into account may undermine cost-cutting initiatives elsewhere. Apart from balancing the books and ensuring operational growth doesn't derail Opex reduction, what else can NGL producers and processors do to limit how much distillation grows their energy footprint?

One method, according to the American Institute of Chemical Engineers, involves targeting energy variability by establishing pressure controls, particularly for light-hydrocarbon columns. Researchers found that even a 7 percent reduction in pressure could save a typical distillation process $240,000 in annual energy costs. Moreover, advanced condenser media capable of holding distilled resources at the ideal temperature could more than double those gains.
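To see how a pressure reduction might translate into dollar savings, consider a rough back-of-the-envelope sketch. The baseline energy cost and the per-percent scaling factor below are invented for illustration (chosen so the numbers land near the $240,000 figure cited above); they are not taken from the AIChE research.

```python
# Hypothetical estimate of annual energy savings from lowering the
# operating pressure of a light-hydrocarbon distillation column.
# All input values are illustrative assumptions, not measured data.

def annual_savings(baseline_energy_cost, pressure_reduction_pct, savings_per_pct):
    """Estimate annual savings, assuming a roughly linear relationship
    between pressure reduction and energy cost (a simplification)."""
    return baseline_energy_cost * pressure_reduction_pct * savings_per_pct

# Assume a column with $4M in annual energy costs, and assume each
# 1 percent of pressure reduction trims about 0.86 percent of that cost.
baseline = 4_000_000
savings = annual_savings(baseline, 7, 0.0086)
print(f"Estimated annual savings: ${savings:,.0f}")  # ~$240,800
```

A real estimate would replace the linear assumption with column-specific data, since the relationship between pressure, reflux, and reboiler duty is not linear in practice.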

Oil and gas companies ought to concentrate more on how they run their distillation towers.

Avoid distillation column misuse
How fractionation towers function in a more general sense also matters, especially if on-site NGL distillation has matured through process mapping and other physical changes to a facility's layout.

For instance, Chemical Processing reported that many refiners focus too heavily on condensers and reboilers when they should give equal consideration to column feeds and how they perform around the feed trays. A misplaced feed could force fractionation towers to work overtime and increase their energy demand unnecessarily. This mistake may also cause asset failure due to imbalance, compromising safety and product quality, as well as every other NGL that would have been recovered down the chain.

When altering process organization for greater operational efficiency, don't move feed locations unless data confirms the change won't jeopardize asset availability and uptime. Remember: Distillation towers are almost the perfect embodiment of the domino effect. If one column is compromised, you will almost assuredly lose all the others until the problem is remedied.

Fractionation presents the natural gas industry with new horizons to conquer and opportunities to turn market troughs into growth as companies expand the scope of their operations. Before integrating new distillation assets or changing how you use the ones already on site, discuss your plans with a knowledgeable consultancy, preferably one specializing in continuous processes and asset utilization, with a proven track record in oil and gas.

How can a business in the chemical processing sector maintain a commitment to continuous improvement as its industry undergoes a period of financial stagnation or decline?

Optimization Realization 
Optimization isn’t an abstract idea – it’s a discipline rooted in real process changes and data-driven exploration into on-site and remote operations. After all, when companies expend the effort to improve, they’re not competing against competitors per se. They’re competing against the best version of themselves.

Unfortunately, that level of dedication does not come free. According to the American Chemical Society, many key players in the chemical processing sphere experienced turbulence entering 2016 and throughout the first half of the year. Some stressors were milder than others, but all contributed in some way to decreased revenue and could affect how these companies optimize and innovate:

The question is: Are there methods for sustaining optimization initiatives even when budgets are tight, and if so, where should chemical processors focus their attention to get the best results?

Be wary of diversification
Reaching into new markets may open new frontiers when business is up, but when sales plateau, product diversification needs serious consideration. Deloitte research revealed that as industrial production contracted across all industries in 2015, chemical demand also waned. In response, many chemical processors took the opportunity to retool their core business rather than take risks on experimental projects. Essentially, they chose to optimize rather than maximize.

However, PwC chronicled the turmoil of a subsection of chemical processing that didn't fall in line with this mentality: engineering polymers. According to PwC, one company tried to expand into new areas of business, both geographically and through product diversification, during this risky market environment. The result was significant strain on logistics resources, internal conflict, and supply chain disruption.

So, when it comes to product diversification, how can businesses tell a sure thing from a dud? By first investing in thorough, unbiased analysis, perhaps from a third party. If funds are tight, however, that money might be better spent on next-generation productivity through an overhaul of the core processes underpinning the company. Doing so has been shown to reduce operational expenditures, freeing up more funds for new opportunities at more stable junctures.

Understand customers – and yourself – through data
The age of big data is both a blessing and a curse for optimization in chemical processing. On the plus side, it presents an opportunity to forecast fluctuations in demand, materials performance, and internal operations charged with capitalizing on these elements. By leveraging the most actionable data management strategies, chemical companies have the power to amp up their services and dive deeper into the nuances of their industry like never before.

The problem is, so can everyone else. So while big data can help an individual business accomplish its goals, it simultaneously raises the bar for all industry players with regard to what clients expect as the status quo.

With that in mind, chemical processors should tailor optimization initiatives toward retaining the customers they already have instead of playing to the clients in their competitors' pools. Focus on what separates your company from others, then strive to optimize those services as much as possible. And don't settle on the services you do best, but rather on the services you do differently that resonate with your customer base.