A New Pathway for Process Improvement

By Ronald Ortiz, Director of Six Sigma, Pacira Pharmaceuticals, Inc.


Creating electronic simulations of the manufacturing process opens new pathways for process improvement that are faster and less costly than traditional approaches, and it can prove to be a competitive advantage in today's business environment. This approach goes well beyond statistical process control (SPC) and process capability (Cpk) charting, getting down to the complex interdependencies between key process parameters inside modern manufacturing.

“Having access to and being able to analyze data from sophisticated manufacturing lines is a well discussed topic but with very few widespread commercial off the shelf solutions”

Digital modeling may seem like a complicated endeavor when in fact it's no more than what we may experience as car owners. If you drive long enough, chances are at some point you will experience the dreaded check engine light. After a brief moment of terror as you brace for your engine to explode, you eventually reach an inescapable conclusion: it's time to take your car in for service. Unfortunately, most car repair scenarios lead to high levels of anxiety, especially when the problem is first encountered and there is little to no data other than the glaring amber light on your dashboard. Ironically, that glaring amber light can turn out to be a symbol of hope.

Since 1996, performance monitoring for automobiles has been standard equipment through the On-Board Diagnostic (OBD) system, which monitors and logs "failures" encountered during operation, i.e. driving. The first thing your mechanic will likely do is connect to your car's OBD system to access the error codes associated with the check engine light. This type of access to data and information changes the scenario completely. What began as fear over a potentially expensive car repair can often turn to immediate relief upon quickly learning that all that was needed was tightening a loose gas cap.

Performance diagnostics on the spot and at the equipment source are incredibly valuable. Saving money and time through smart diagnosis improves quality, which leads to less corrective maintenance and greater throughput with more operating time. Surprisingly, there are few diagnostic systems for manufacturing environments outside of the automotive industry. Having access to and being able to analyze data from sophisticated manufacturing lines is a well-discussed topic, but one with very few widespread commercial off-the-shelf solutions. Industry regulatory bodies such as the FDA are beginning to push for more statistics-based analyses, such as continued process verification (CPV) of key process parameters during pharmaceutical manufacturing. The reason for regulatory control is clear, and the underlying idea is deceptively simple: there is a direct connection between process performance and the quality of your final product. Eventually, not having a diagnostic measuring system in place will have to be addressed, not only for regulatory purposes but to continuously improve manufacturing performance.

In response, companies in most industries have instituted some form of operational excellence program using statistical process control (SPC) to monitor process performance and address the need for tighter quality controls. Most will believe this is more than enough effort. Most will be wrong. Real value lies well beyond SPC and process capability. SPC does a fine job of providing effective signals for identifying something that has failed or is trending toward failure. Having trends and controls in place, however, falls short of a dynamic understanding of the current process. SPC helps focus attention, but rarely does it offer insight into how to correct failures in the first place or, more importantly, prevent them. This is because SPC charts are primarily static snapshots of what has already happened, not of the reasons why it might be happening in the first place.

Considering the high-tech aspects of many industries, it's a rather odd situation. The systems that control the vast majority of modern manufacturing equipment are computer based, with built-in protocols designed to execute and monitor all manner of parameters such as time, temperature, pressure, and mixing speed, to name but a few. Sometimes these parameters are recorded as a single discrete value in a logbook, sometimes they're printed off as a run report, but most of the time they're ignored. The opportunity lost by failing to translate this vast treasure trove of data into actionable information can be measured in the cost of poor quality, lost revenue, and eroding market share. When presented in such terms, digital process modeling becomes a recognizable competitive advantage. The ability to model and predict how well a manufacturing process is executing during operation can lead directly to bottom-line savings and should be part of any good business case for investing in a software modeling system aimed at squeezing that extra bit of needed quality out of the process.

Digital models offer closer-to-real-time parameter measurement. They are equation-based predictor models that can be used to tweak process variables and observe the impact on desired quality characteristics, in conjunction with all the other process parameters simultaneously and without wasting materials during manufacturing. For instance, say you were interested in the ratio of raw materials placed into solution versus the impact on a product's yield. At bench scale this was measured in grams, plus or minus a few, but at large-scale manufacturing the measurement is now in metric tons. Is the same level of precision required? Is the recipe now 1.5 metric tons plus or minus a few grams? Can that level of precision even be achieved? What's the yield variability at full scale versus the original bench scale? Imagine informing the manufacturing team that you plan to shut down the production line for two weeks to run a series of experiments using actual raw materials, with none of the product produced being commercially available for sale, just to answer these questions. Now imagine being able to run those same experiments in a digital model on a computer and gaining the understanding necessary to implement the changes during the next routine maintenance interval. Which scenario would you rather be in?
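To make the idea concrete, here is a minimal sketch of what such an equation-based predictor model looks like in practice. The yield equation and every coefficient in it are hypothetical illustrations, not fitted values from any real process; a real model would be regressed from historical line data.

```python
# Hypothetical equation-based predictor model (all coefficients are
# illustrative, not fitted values from a real manufacturing process).
def predicted_yield(ratio, temp_c):
    """Predicted yield (%) as a function of raw-material ratio and temperature."""
    # A quadratic response surface of the kind a regression fit might produce,
    # peaking at a ratio of 1.50 and a temperature of 65 C.
    return 92.0 - 180.0 * (ratio - 1.50) ** 2 - 0.05 * (temp_c - 65.0) ** 2

# "What-if" sweep: vary the ratio electronically instead of on the live line.
for ratio in (1.40, 1.45, 1.50, 1.55, 1.60):
    print(f"ratio={ratio:.2f} -> predicted yield {predicted_yield(ratio, 65.0):.1f}%")
```

A sweep like this answers the precision question above in seconds: the flatter the predicted response around the target ratio, the less precision the full-scale recipe actually requires.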

Digital modeling can be that powerful. When implemented right, it can quickly create information by understanding and quantifying the parameter combinations that run a process at optimal performance. It can also be used to perform electronic Designs of Experiments (DOE) to test equipment changes before actually implementing them on live systems, saving both time and development cost. Even parametric release can be achieved, whereby product quality is verified through operating parameters meeting optimal specifications, further reducing time and cost in testing cycles.
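An electronic DOE can be sketched in a few lines: enumerate every combination of factor levels and evaluate each one against the digital model instead of on the live line. The response model below is a hypothetical stand-in for a fitted process model; the factors and levels are invented for illustration.

```python
from itertools import product

# Hypothetical response model standing in for a fitted digital process model.
def predicted_yield(ratio, temp_c):
    return 92.0 - 180.0 * (ratio - 1.50) ** 2 - 0.05 * (temp_c - 65.0) ** 2

# Full-factorial electronic DOE: two factors at three levels each,
# giving 3 x 3 = 9 simulated runs with zero material consumed.
ratios = (1.45, 1.50, 1.55)
temps = (60.0, 65.0, 70.0)
runs = [(r, t, predicted_yield(r, t)) for r, t in product(ratios, temps)]

best = max(runs, key=lambda run: run[2])
print(f"best settings: ratio={best[0]:.2f}, temp={best[1]:.1f} C, "
      f"predicted yield {best[2]:.1f}%")
```

The same pattern scales to fractional-factorial or response-surface designs; the only change is which combinations are enumerated.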

Every system and every process has variation. The better you understand the magnitude of each operating parameter's inherent variation and its effect on the end product, the better, faster, and less costly your manufacturing will be. Whatever your definition of quality, if it can be measured, it can be modeled. The mindset of today's manufacturing environment needs to shift to recognize this opportunity and begin leveraging its data. If this prospect is not realized, you might find yourself one day having to explain to the General Manager that the equipment run failed for no known reason, only to have an expensive consultant come in and discover that all that was needed was to tighten the "gas cap".
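Quantifying what each parameter's inherent variation does to the end product is itself a modeling exercise. One simple approach is Monte Carlo propagation: sample each parameter from its observed distribution and push the samples through the model. The yield equation and the standard deviations below are made-up illustrations, not real process data.

```python
import random

random.seed(42)

# Hypothetical yield model; coefficients are illustrative, not real process data.
def predicted_yield(ratio, temp_c):
    return 92.0 - 180.0 * (ratio - 1.50) ** 2 - 0.05 * (temp_c - 65.0) ** 2

# Propagate each parameter's assumed inherent variation (the standard
# deviations here are invented for illustration) through the model.
samples = [
    predicted_yield(random.gauss(1.50, 0.02), random.gauss(65.0, 1.5))
    for _ in range(10_000)
]

mean = sum(samples) / len(samples)
spread = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
print(f"predicted yield: mean {mean:.2f}%, std dev {spread:.2f}%")
```

The output spread is the yield variability the line should expect even when every parameter stays "in spec", which is exactly the kind of insight an SPC chart alone cannot provide.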
