Therapeutic Dose: The Importance of Design Qualification
In my last column, I spoke about measuring the correct parameters of raw materials, not just the traditional compendial tests. That observation carries over to all of the processing steps for solid dosage forms, too. We in the pharmaceutical industry are in a unique position, albeit not a flattering one.
Process chemists, engineers, and analysts in virtually every other industry are acutely aware of what is happening in the production stream of their respective companies. They intimately know the input variables, the chemical reactions taking place, the effect of catalysts, the side reactions, and how to affect product yield. Despite the complexity of any reaction, it is, after all, merely a chemical reaction. It follows rules we all learn, starting as sophomores in college.
Before my synthetic chemist friends go ballistic, allow me to explain. All chemical reactions follow well-known paths, even the synthesis of drugs, no matter how much pharma people make believe that they are doing magic. Understanding what is in the pot at any time allows them to purchase analytical instrumentation that will provide control throughout the process, no matter how complex. This leads to standardized instruments and a general lowering of prices of those toys.
After all, cracking of petroleum is the same for ExxonMobil, as for Hess, as for Getty, as for Shell, etc. When you work out the design for the reaction monitor, say for octane number, everybody buys the same one. Thus, organizations such as ASTM can give guidelines for the testing of materials such as petroleum.
Would that were true for pharmaceutical solid dosage forms as well. The current testing (per my last column, "PAT in Perspective: Safe? Yes. Effective? Not So Much.") is chemical testing of a physical mixture of powders. We toss them into a blender and spin, shake, rattle, and roll them until we believe, or hope, that there is a resultant well-dispersed (as a physical chemist, I refuse to refer to a powder blend as homogeneous) mass of powder. We can wet and dry this mass to make magic nuggets that can be tableted. We then press it into tablets or fill it into capsules and hope that we are within 10-15% of the label claim. Hmmm.
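To put a number on that "hope," here is a minimal sketch, with made-up assay values, of the simple percent-of-label screen alluded to above; the actual compendial uniformity test (USP <905>) uses a more involved acceptance-value calculation, so treat this as a back-of-the-envelope illustration only.

```python
# Minimal sketch: flag tablet assay results that fall outside a simple
# percent-of-label window. The label claim, assay values, and the 85-115%
# window are illustrative only; the compendial test (USP <905>) applies a
# more involved acceptance-value calculation.
label_claim_mg = 100.0                         # hypothetical label claim
assays_mg = [98.2, 103.5, 91.7, 110.4, 99.8]   # hypothetical tablet assays

for i, mg in enumerate(assays_mg, start=1):
    pct_of_label = 100.0 * mg / label_claim_mg
    status = "OK" if 85.0 <= pct_of_label <= 115.0 else "OUT OF RANGE"
    print(f"Tablet {i}: {pct_of_label:5.1f}% of label claim -> {status}")
```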
Under the PAT initiative, also aptly named Quality by Design (QbD), we are encouraged to monitor critical parameters of a pharmaceutical process. The problem is, if we knew what the critical parameters were, we would already be monitoring them. Therein lies the dilemma: How can we find an appropriate monitor to measure what we don't know or aren't sure of? Let's quickly examine what we have in place at the moment, shall we?
To perform the analyses we currently run, we have a pretty thorough qualification procedure for lab instruments. We have all the needed documentation and procedures for the installation of the hardware and its associated software, cleverly named Installation Qualification (IQ). This is largely based on the specifications of the instrument manufacturer and is normally performed by its representative. It usually covers power requirements, environment, and so forth. In many ways, this is similar to ISO 9000 requirements: Is the instrument we have compliant with the blueprints for that instrument? But does it run?
The Operational Qualification (OQ) is as it sounds: the instrument performs its functions as prescribed by the manufacturer. That is, it can scan, pump, blink, and chirp just as described in its manual. This testing is a cooperative effort between the sales engineer and the analyst. Virtually any instrument manufacturer worth its salt will have documents that carefully describe the instrument's functions and how to check the operation of each one. These will include the calibration procedure for cGMP compliance. For spectrometers, wavelength and linearity standard materials are included (at a nominal fee, of course). Now we have to prove that it works in our application.
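As an illustration, a wavelength-accuracy check of the sort an OQ protocol might script could look like the minimal sketch below; the certified values, readings, and tolerance are hypothetical stand-ins for the manufacturer's (or pharmacopeial) standards and acceptance criteria.

```python
# Minimal sketch of an OQ-style wavelength-accuracy check: compare measured
# peak positions from a reference standard against certified values and a
# tolerance. All numbers are hypothetical; a real OQ uses the vendor's or
# pharmacopeial standards and documented acceptance limits.
certified_nm = [279.3, 360.9, 453.4]   # hypothetical certified band positions
measured_nm  = [279.5, 360.7, 453.9]   # hypothetical instrument readings
tolerance_nm = 1.0                     # hypothetical acceptance limit

for cert, meas in zip(certified_nm, measured_nm):
    error = meas - cert
    verdict = "PASS" if abs(error) <= tolerance_nm else "FAIL"
    print(f"Certified {cert:6.1f} nm, measured {meas:6.1f} nm, "
          f"error {error:+.1f} nm -> {verdict}")
```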
The effectiveness for our analysis is called the Performance Qualification (PQ) and is heavily dependent on the user. It is, in short, a document demonstrating that the unit performs the given analysis per the written SOPs.
All these Qs were, of course, designed for laboratory instruments. When an analyst purchases an HPLC, for instance, he/she knows that the sample will be dissolved in solution for injection. In this case, "one size fits all" is appropriate. The difficult part of the setup will be choosing the column and mobile phase, since the hardware is standard. This goes for spectroscopic instruments as well. Mid-range infrared samples are usually mulls, thin (liquid) films, or KBr disks. In the ultraviolet and visible ranges, solutions are made from the samples to be analyzed. Even near-infrared, in a lab setting, usually involves powder cups, dipping fiber optic probes, or tablet holders. All this is like shooting tame birds and requires little choice beyond which company gives good service in your area.
All this is fine as it stands, but one other Q seldom mentioned is the Design Qualification (DQ). The instruments to be used must be designed for some sample paradigm. For lab work, this is pretty straightforward and, for the vast majority of instruments manufactured, this portion falls squarely on the backs of the instrument companies. They attempt to listen to the market and design what will fill a void. Often, these are for the same tests as have been performed in a lab for decades; the new instruments are merely faster, more sensitive, easier to operate, more durable, etc. The input of the consumer is usually limited to which cuvettes or which detector (e.g., for an HPLC) is required or, a more recent discovery for analysts, which fiber-optic probes.
This is all well and good for lab work, but now we are trying to do process analysis, and it's a whole new ballgame. For simply automating a lab assay for at-line or on-line work, many reputable companies can easily supply a case-hardened instrument. If we are merely interested in the percent moisture or content uniformity of the API, dozens of instruments spring to mind, all of which will perform nicely.
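To illustrate that kind of routine on-line measurement, one common approach (not spelled out in this column) to watching blend uniformity is a moving-block relative standard deviation on successive readings, such as NIR-predicted API concentrations; the sketch below uses made-up readings, block size, and endpoint criterion.

```python
# Minimal sketch: moving-block relative standard deviation (RSD) on a stream
# of successive on-line readings (e.g., an NIR-predicted API concentration),
# one common way to watch a blend approach a stable, well-dispersed state.
# The readings, block size, and endpoint threshold are all made up.
from statistics import mean, stdev

readings = [9.1, 11.8, 10.4, 9.7, 10.2, 10.1, 9.9, 10.0, 10.1, 10.0]
block = 5              # hypothetical moving-block size
endpoint_rsd = 2.0     # hypothetical endpoint criterion, % RSD

for end in range(block, len(readings) + 1):
    window = readings[end - block:end]
    rsd = 100.0 * stdev(window) / mean(window)
    flag = "<- endpoint criterion met" if rsd <= endpoint_rsd else ""
    print(f"Readings {end - block + 1}-{end}: RSD = {rsd:4.1f}% {flag}")
```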
But (don't all my columns seem to have a "but"?), what about measuring the parameters that we are becoming aware are truly important, not the same-old, same-old? How do the converted "one size fits all" lab analyzers measure the physical parameters we are finding to be the real controlling factors? At what cost in time and money?
One beauty of current hardware is that it may be used for a myriad of tests. Such instruments are, as my co-instructor (Jack Carroll) named them, "genius machines." Unfortunately, when you are measuring one parameter, simple or complex, it is desirable to have a monitor (or, per Jack, an "idiot savant") that does that one thing very well, repeatedly and inexpensively.
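A minimal sketch of that "idiot savant" idea, with hypothetical names and limits, is simply a monitor that checks exactly one parameter against one set of limits and reports go/no-go, over and over:

```python
# Minimal sketch of the "idiot savant" idea: a monitor that measures exactly
# one parameter against one pair of limits, as opposed to a general-purpose
# "genius machine". The parameter name and limits are hypothetical.
class SingleParameterMonitor:
    def __init__(self, name: str, low: float, high: float):
        self.name, self.low, self.high = name, low, high

    def check(self, value: float) -> bool:
        """Return True if the single measured value is within its limits."""
        ok = self.low <= value <= self.high
        print(f"{self.name}: {value} ({'within' if ok else 'outside'} "
              f"{self.low}-{self.high})")
        return ok

# Hypothetical use: one monitor, one job, run repeatedly and inexpensively.
moisture = SingleParameterMonitor("granulation moisture, % LOD", 1.5, 3.0)
moisture.check(2.2)
moisture.check(3.6)
```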
This is where the materials scientists, engineers, and physical chemists come in. The instrument companies are good at one thing: building well-behaved and reliable instruments. They cannot be expected to do our R&D for us. They need to be told what we are looking for in the new paradigm of PAT (i.e., QbD). In fact, most normal instrument manufacturers may not have the personnel to make specialty hardware and software for, say, monitoring the degree of crystallinity of the API and excipients during a blending operation. This sort of design may take physicists, optics engineers, software writers, mechanical engineers, and spectroscopists. Most instrument companies are too small to carry such a large staff for specialty orders. (Later, when the specialty instrument is selling in the thousands, they may be the best candidates to produce it, however.)
Hence, as with software, third-party vendors will make an impact here. I am aware of a number of engineering companies capable of (and good at) this type of work. I was pleased to visit one recently in Finland named VTT (the Finnish Research Institute). VTT is quite an impressive operation. The people there showed me an array of machinery that can construct anything I could imagine. Tie that in with the engineers and scientists and, well, you see what I mean.
It may well be that PAT instrumentation for these new parameters will not be off-the-shelf instruments, but will have to be designed as a cooperative effort between the pharmaceutical scientists and specialty engineering firms. The D in DQ stands for design: not "what do you have in the showroom?" but design!
Remember those head-count additions I mentioned last month: engineers, materials science types, physical chemists/pharmacists? Put the ads in the Sunday papers; we're gonna need them sooner rather than later. We will need to ascertain which parameters we need to measure, then control. We will then have to sit down with either current instrument companies or engineering firms and actually work together on hardware and software that will give us the control of our processes that will enable us to design quality into our products. Then we can set specifications for raw materials, once we know what to measure and have designed instruments to make those measurements correctly and accurately. Piece of cake, no?