We’ve all seen articles speculating on pharmaceutical Quality by Design’s impact on drug development and speed to market. However, speed to market is simply part of a larger strategy to maximize profitability. The sooner the product is on the market, the sooner research and development costs can be recouped.
There are costs associated with accelerating late-stage development, scale-up and tech transfer to manufacturing. Isn't the real question, then, the cost to market weighed against the competitive advantage of getting there sooner?
Where does QbD fit in? It isn't about getting to market fast, at least not directly. QbD is about getting to market reliably: knowing enough about the limitations and risks of formulation and production methods to establish appropriate mitigation and contingency plans.
Organizations that go to market fast with limited formulation and process knowledge risk disruptions whose cost and time losses will quickly outstrip the advantage of being there early.
The case study that follows shows both the cost of disruption and a good estimate of what it would have cost to apply QbD methodologies earlier in the product's life cycle. It also presents manufacturers with options for choosing the right point in the life cycle to apply QbD, and with solutions for a problematic product already on the market. Finally, I will discuss how applying QbD tools and tactics is not just a vehicle for cost improvement; it is a significant opportunity for organizational development that improves cross-functional coordination in product development and lays the foundation for QbD to occur earlier in the development process.
Case Study: A Closing Window
A major pharmaceutical manufacturer had an opportunity for six months of patent exclusivity against generic competition if it could launch a controlled release product extension by a given date. Annual sales were projected to be in the neighborhood of $100 million—not a blockbuster but still a significant opportunity, since the six months would enable $50 million in sales.
Because this product was controlled release, its production process was complicated, but the developer's project plan to scale up, validate and transfer the process to manufacturing put the line extension comfortably within the launch window. On that basis, Sales and Marketing obtained purchasing commitments from their distribution channels, relying on Operations' commitment to fill the pipeline and maintain stocking levels.
The product was in commercial production and launch quantities were being produced to fill the pipeline when its dissolution rate began to trend out of specification. While the origin of the problem was unknown, a stopgap measure in production mitigated the effect to some extent. The downside was that the stopgap measure caused significant yield loss. Batches lost to outright failure, plus the yield losses from this measure, cost $250,000 a month as the organization scrambled to replace lost production time and materials while still producing launch quantities. Not only was the product costing more to produce; the launch window was now in jeopardy. That threatened not only the $50 million in exclusive sales: agreements with distribution channel partners also exposed the organization to potential penalties of $400,000 a day if it failed to meet its stocking commitments.
As the company's Operations department poured more money and resources into expediting materials, rescheduling production and working overtime, QA and Development had to divert resources to deviation reporting and troubleshooting. Meanwhile, Regulatory wrestled with whether to file a CBE-0 to institutionalize the stopgap measure, which meant much lower yields and, therefore, much higher production costs and lower profits.
So, here is where QbD reenters the conversation. Quality by Design is nothing more than risk mitigation through process understanding. The earlier in development and commercialization that this understanding is gained, the sooner risk is quantified and, if not mitigated, at least understood well enough that appropriate management and cost-allocation contingency plans can be put in place. This case illustrates the multiple, painful collateral effects across the organization when risk is instead identified through unplanned failure.
Calculating Costs
To estimate what QbD would have cost in this case, consider the activities, resources and costs associated with the emergency application of QbD methodologies to troubleshoot and correct the problem. Four full-time employees were deployed for 15 weeks: a formulation scientist, a process engineer and a statistician, led by a senior project manager with experience in all three areas. Apart from their full-time dedication to this one project, the team employed tools and tactics much like those a team pursuing proactive QbD would use.
The first 12 weeks were consumed by data collection, database construction, creation of control charts of input and output variables, hypothesis generation, multivariate analysis and the design of experiments (DOE) for hypothesis screening and confirmation. Carrying out the DOE work consumed the last three weeks. As in many cases, the team had to spend a disproportionate amount of time on data collection because much of the data had to be manually transcribed, and the data that were in electronic format were spread across multiple databases.
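For readers unfamiliar with the mechanics, here is a minimal sketch of a two-level factorial screening design, the kind of run layout such an effort typically produces. The factor names are hypothetical stand-ins, not the actual variables from this case.

```python
# A minimal two-level full-factorial screening design; the factor names
# below are hypothetical examples, not the variables from this case.
from itertools import product

factors = ["api_particle_size", "blend_time", "compression_force"]
levels = [-1, +1]  # coded low/high settings for each factor

# 2^3 = 8 runs covering every combination of high/low settings
for run_no, settings in enumerate(product(levels, repeat=len(factors)), start=1):
    print(f"Run {run_no}: {dict(zip(factors, settings))}")
```

Running the experiment at every combination of coded settings lets the team estimate each factor's effect on dissolution independently, which is what makes screening a short exercise once the data groundwork is done.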
The resulting process understanding yielded a solution that did not require continuing the stopgap method with its associated yield losses, and there was no need for a supplemental filing. The QbD methodology identified and confirmed that the dissolution problem was related to an API characteristic that had not previously been thought critical. A change in suppliers had produced a subtle change in this characteristic which, while still within specifications, shifted dissolution performance, resulting in all the costs and problems outlined above.
Fortunately, the stock-out condition that could have triggered the $400,000-a-day penalty was avoided. Product scrap losses, however, approached $500,000, and almost half of the six-month, $50 million exclusivity opportunity was lost.
The hidden costs of lost productivity in Operations, QA, QC, Regulatory and Development no doubt added tens of thousands of dollars to the total cost of unexpected failure. The emergency execution of QbD methodologies took four people 15 weeks, or about 2,400 man-hours. Multiplying this by an organization's fully loaded hourly rate for internal resources yields a reasonable cost estimate.
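As a rough illustration, a few lines of arithmetic put that effort side by side with the losses cited above. The $150-per-hour loaded rate is a hypothetical placeholder, and the last figure is lost sales, not profit.

```python
# Back-of-the-envelope cost of the emergency QbD effort vs. the losses
# cited in this case. The loaded hourly rate is a hypothetical placeholder.
HOURS = 4 * 15 * 40                # 4 FTEs x 15 weeks x 40 hr/week = 2,400
LOADED_RATE = 150                  # hypothetical fully loaded $/hour

qbd_effort = HOURS * LOADED_RATE   # ~$360,000 in internal labor
scrap_losses = 500_000             # product scrap cited in the case
lost_exclusive_sales = 25_000_000  # ~half of the $50M window (sales, not profit)

print(f"QbD troubleshooting effort: ${qbd_effort:>12,}")
print(f"Scrap losses:               ${scrap_losses:>12,}")
print(f"Lost exclusive sales:       ${lost_exclusive_sales:>12,}")
```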
In this case, the need for external staff augmentation drove the cost higher, but even so, the return on investment was more than acceptable. More importantly, if the same tactics and tools had been integrated with other work streams during late development, validation and tech transfer, the costs would have been lower, and it is very likely this whole fire drill could have been prevented.
Better understanding of risk allows management to predict and mitigate situations that can increase the time and cost of bringing a product to market. Gaining that understanding through proactive application of QbD methodologies is going to be less expensive than unexpected failure. It is one thing to make an educated decision based on science. It’s quite another to take a risk based on the fact that similar products made in the past did okay.
When to Apply QbD
Let's agree that QbD is good business, but where should it be integrated into the development process? A year ago, I would have said early in pilot-plant scale-up and validation. While I still encourage developing process knowledge early in product development, I also recognize a growing industry trend toward a hybrid approach. This approach maximizes speed by prioritizing the critical-path elements of clinical supply and validation submissions, then applies QbD's statistical and process-characterization risk assessment tools during scale-up, continuing through manufacturing tech transfer and into the first year or more of the product life cycle. Data that better illuminate and expand the product's design space can be included in annual reports.
Establishing the foundation for ongoing improvement of process knowledge and ongoing expansion of the design space is synergistic with PAT and addresses another problem common to prospective QbD in the industry. Pharmaceutical and biotech manufacturers demand and receive tight control of raw material characteristics from their suppliers. As a result, the actual values for lots received are often narrowly grouped well within the specification range. Even when asked, raw material and API manufacturers are hard-pressed to produce material much outside their normal narrow range and often refuse requests to do so.
So, while manufacturers can push the boundaries of the design space with respect to their controllable manufacturing parameters, it is often impossible to obtain material whose characteristics vary enough even to support a decent regression, let alone define a broad design space. In fact, most drug product manufacturers are forced to validate the drug product within a narrow range of the API specifications. Often it is only when a serendipitous change occurs that additional data can be generated to shed more light on the relative importance of a characteristic and afford a chance to expand the design space.
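A small synthetic simulation illustrates the problem: the same underlying relationship that is easy to estimate when input lots span the specification range becomes nearly undetectable when lots are tightly grouped. All numbers here are made up for illustration.

```python
# Synthetic illustration: the same true slope is easy to estimate when
# input lots span the spec range and nearly impossible when they are
# tightly grouped mid-spec. All values are made up for illustration.
import numpy as np

rng = np.random.default_rng(2)
TRUE_SLOPE = 2.0

def estimated_slope(x):
    y = TRUE_SLOPE * x + rng.normal(scale=1.0, size=x.size)  # noisy response
    return np.polyfit(x, y, 1)[0]

wide = rng.uniform(-1.0, 1.0, size=30)      # lots spanning the spec range
narrow = rng.uniform(-0.05, 0.05, size=30)  # lots grouped tightly mid-spec

print(f"slope estimate, wide input range:   {estimated_slope(wide):.2f}")
print(f"slope estimate, narrow input range: {estimated_slope(narrow):.2f}")
```

With the narrow inputs, the noise swamps the signal and the slope estimate is essentially random, which is exactly why tightly grouped raw material lots frustrate design-space work.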
The case study presented here is one of dozens with similar issues and results. The difference between this case and many others is that the disruption occurred relatively early in the product life cycle. We often see similar disruptions, with large monetary impact, in well-established products. We also often see products where ongoing problems such as poor yields or occasional batch loss have cost the producer millions of dollars over a long period before finally being solved. Significantly, we sometimes find that characteristics or parameters tightly controlled because they are thought to be important turn out not to be important after all, making the expense of controlling them a waste.
What is common to all these cases is that when QbD principles, tools and tactics are rigorously applied, even problems that have plagued an organization for years can be resolved. Speed and the competitive advantage it affords are benefits of early life cycle application, but significant gains can be made in risk and cost management through process understanding of established products. More importantly, the regulatory environment seems to be evolving such that FDA is more receptive to ongoing evolution of specifications and controls when supported by QbD rigor.
What is also common to all these cases is that the effort to rigorously understand the process invariably results in organizational development, not just in terms of increasing overall competence in statistically establishing cause and effect but, more importantly, in cross-functional organizational readiness.
In almost every case we've observed, the organization must begin with an exhaustive data collection effort to integrate data manually transcribed from batch records and certificates of analysis with data stored in disparate electronic databases. Then, often for the first time in the product's life cycle, control charts are produced for all input and output variables. These, in and of themselves, often provide the perspective to screen out popular but erroneous hypotheses of cause and effect and sometimes catalyze the generation of new hypotheses.
Trends and shifts heretofore hidden are revealed, and variables with very little actual variability are identified. Correlation and regression models not only substantiate or disprove cause and effect; they also identify gaps in the data that make it impossible to explain process variability. In one case, the input variables for which the organization actually had data explained only about 30 percent of the variability in the system. More than 40 input variables with the potential to affect product outcomes were not being collected at all; once identified, data collection plans for them were put in place.
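For the statistically inclined, a figure like "recorded inputs explain only 30 percent of the variability" is simply a regression's R-squared. The sketch below, using synthetic data and hypothetical variable counts, shows how that number falls out when influential inputs are missing from the record.

```python
# Synthetic illustration of "percent of variability explained": regress
# the output on only the recorded inputs and read off R-squared.
# Variable counts and data are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
recorded = rng.normal(size=(n, 5))     # inputs the site actually collects
unrecorded = rng.normal(size=(n, 10))  # influential inputs it does not

# The true process depends on both sets, but only `recorded` is available.
y = recorded @ rng.normal(size=5) + unrecorded @ rng.normal(size=10)

model = LinearRegression().fit(recorded, y)
print(f"Variability explained by recorded inputs: "
      f"R^2 = {model.score(recorded, y):.0%}")
```

A low R-squared on the available inputs is itself a finding: it quantifies how much of the process is being driven by variables no one is measuring.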
Data-Driven Insight
When organizations complete this level of exhaustive data screening and correlation analysis, they not only gain insight into the real cause-and-effect relationships driving deviations and cost; they also acquire a data-driven understanding of which variables require close attention and control and which do not. They have a new platform for data collection and trend analysis that provides advance warning when critical variables are shifting. Equally important, they understand process capability and can differentiate between the normal random variation of a process in control, which requires no action, and real shifts that require immediate attention.
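That last distinction is what Shewhart-style control limits formalize. The following sketch, with a synthetic baseline and made-up observations, flags for investigation only values that fall outside the three-sigma band computed from an in-control history.

```python
# Sketch: separating common-cause variation from a real shift with
# Shewhart-style 3-sigma limits. Baseline and observations are synthetic.
import numpy as np

def control_limits(baseline):
    """Mean +/- 3 sigma computed from an in-control historical baseline."""
    mu, sigma = baseline.mean(), baseline.std(ddof=1)
    return mu - 3 * sigma, mu + 3 * sigma

rng = np.random.default_rng(1)
baseline = rng.normal(loc=80.0, scale=1.5, size=60)  # e.g., % dissolved
lcl, ucl = control_limits(baseline)

for obs in (81.2, 79.4, 74.0):  # the last value simulates a real shift
    status = "investigate" if not lcl <= obs <= ucl else "common cause"
    print(f"{obs:5.1f} -> {status}")
```

Reacting only to points outside the limits prevents the costly habit of chasing noise while ensuring genuine shifts, like the dissolution trend in the case study, get immediate attention.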
Finally, having completed retrospective QbD-based analyses, the organization gains insight into the types of variables and data it wants to collect and trend from the outset of new product development, avoiding the painful process of pulling data together manually while under the gun to explain an unexpected and costly deviation. This shared understanding has the collateral benefit of identifying and reinforcing common purpose and the interrelation of the functional groups involved in new product development and tech transfer. This, in turn, better positions the organization to execute QbD to identify and mitigate risk earlier in the product life cycle.
About the Author
J. Paul Catania is Managing Consultant with Tunnell Consulting, Inc. For more than 30 years, he has helped leading organizations achieve significant improvement in operational and financial performance. With international experience leading large-scale cultural change initiatives, he is an expert in Lean Six Sigma improvement methodologies and in functional areas including manufacturing operations, quality assurance, supply chain management, procurement and strategic sourcing. In the course of his career, he has held key roles at a number of leading life sciences companies, including Johnson & Johnson. He can be contacted at [email protected].