Just as manufacturers have moved from clipboards and hand-written data collection to automated electronic systems for batch records, many are doing similar things for field instrumentation calibration. In addition to being less labor-intensive, these newer approaches make the documentation less prone to error and more suitable for presentation to regulatory groups.
This dovetails with changes on the verification side. FDA/ICH Guidances for Industry describe verification as a continuous process with three key elements:
- Detect unplanned departures from normal operation
- Collect and analyze product and process data related to quality
- Maintain the facility, including qualification of equipment.
The third element includes instrumentation calibration. It is nonsensical to imagine operating a critical manufacturing process or plant without verifying that the information coming from instruments is true and accurate. If a process has been validated based on a specific reaction happening at 85 °C (±1 °C), verification ensures the temperature instrument monitoring the reaction is accurate and capable of delivering a reading with an appropriate degree of precision every time.
How that verification is performed depends on the measurement type:
• Temperature: dry block test
• Pressure: manual or automatic pump used to create simulated air pressure
• Level: varies by instrument type, with degree of difficulty ranging from low (dP instruments) to very high (time-of-flight instruments)
• Flow: varies by instrument type, with degree of difficulty ranging from low (dP instruments) to very high (mag meters and other non-contact instruments)
• Analytical: requires simulation of the media property the instrument is measuring, such as a calibrated buffer solution for a pH sensor
Understanding Calibration Regulatory Requirements
The Code of Federal Regulations Title 21, Part 211.68 says automatic, mechanical or electronic equipment can be used in drug manufacturing, and if it is used, “It shall be routinely calibrated, inspected or checked according to a written program designed to assure proper performance. Written records of those calibration checks and inspections shall be maintained.”
This regulation places the burden for creating a compliant calibration program on the facility. It does not specify how or how often any specific instrument must be calibrated or checked. At the same time, when the facility is being inspected by a government agency for regulatory compliance, those details will certainly be examined. A facility will have to defend its calibration practices within its larger validation and verification programs.
So ask yourself if your calibration program is appropriate for the instruments you’re using today, or if it still reflects the needs of less sophisticated process instruments of years past.
Streamline Calibration Record Keeping
Electronic records for product movement and manufacturing have largely replaced expensive and troublesome manual techniques. Unfortunately, in a surprising number of facilities, manual recordkeeping for process instrument calibration and maintenance persists. Since calibration tasks tend to be manual by nature, recordkeeping by hand often follows, but better methods are available.
Just as electronic batch records improved manufacturing processes, they can also do the same for calibration. Consider the typical steps for a temperature sensor (an automated equivalent is sketched in code after the list):
- Find the appropriate instrument, identify it positively, and remove it from its mount
- Find the relevant test and calibration protocol for the specific instrument
- Heat up the dry-block calibrator to the first test point and take the first reading
- Reset the dry-block temperature, wait for it to change to the new value and repeat the reading
- Do this as many times as specified, typically five test points for a critical instrument
- Write the relevant information on a test sheet, or type it into the computer terminal
- Write out the calibration sticker and return the instrument to service or the stockroom.
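To make the contrast with manual recordkeeping concrete, here is a minimal Python sketch of what an automated version of that sequence could look like. Everything here is an assumption for illustration: the dry-block driver methods (`set_temperature`, `wait_until_stable`), the device's `read_temperature` call, and the test points and tolerance are hypothetical stand-ins, not any vendor's actual API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List

# Hypothetical test points and tolerance for a critical temperature sensor.
TEST_POINTS_C = [20.0, 40.0, 60.0, 80.0, 100.0]  # five points, per the list above
TOLERANCE_C = 0.5                                # assumed acceptance limit

@dataclass
class CalibrationResult:
    tag: str           # instrument tag, positively identified
    point_c: float     # dry-block setpoint
    reading_c: float   # device-under-test reading
    passed: bool
    timestamp: str

def run_calibration(tag: str, dry_block, device) -> List[CalibrationResult]:
    """Step through each test point and record the results electronically."""
    results = []
    for point in TEST_POINTS_C:
        dry_block.set_temperature(point)     # assumed driver method
        dry_block.wait_until_stable()        # assumed driver method
        reading = device.read_temperature()  # assumed driver method
        results.append(CalibrationResult(
            tag=tag,
            point_c=point,
            reading_c=reading,
            passed=abs(reading - point) <= TOLERANCE_C,
            timestamp=datetime.now(timezone.utc).isoformat(),
        ))
    return results  # ready for upload to the calibration database
```

Each reading is captured and judged automatically, so nothing depends on a technician transcribing numbers correctly under pressure.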
A technician working under pressure and in a hurry might inadvertently write down incorrect information. Other typical mistakes include testing the wrong instrument, testing to the wrong set of calibration points, testing against an incorrect standard and so on. In any situation where a human being has to read and write numbers, mistakes will be made. While automating calibration to the extent of removing the human element entirely isn’t possible for all instruments, there are ways to automate calibration and recordkeeping.
Smart Instruments Simplify Procedures
Smart field instruments can communicate far more than a simple analog process variable. Protocols such as HART and WirelessHART provide the means for a smart transmitter to store a great deal of configuration and calibration information (modeled in the sketch after this list), including:
- Its own tag number
- Measuring units and range
- Calibration history
- Calibration test points
- Self-diagnostic functions
- Alarm points, and more.
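As a rough illustration of the kind of record such a transmitter carries, the following Python dataclass mirrors the list above. The field names are illustrative only; the actual variables a HART device exposes are defined by the protocol and the device's device description (DD) file.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# An illustrative model of transmitter-resident data; not HART-defined names.
@dataclass
class SmartTransmitterRecord:
    tag: str                                  # the device's own tag number
    units: str                                # e.g. "degC" or "psi"
    range_low: float
    range_high: float
    calibration_history: List[str] = field(default_factory=list)
    calibration_test_points: List[float] = field(default_factory=list)
    self_diagnostics_ok: bool = True          # result of self-test routines
    alarm_low: Optional[float] = None
    alarm_high: Optional[float] = None
```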
These capabilities have greatly improved the calibration picture. A smart transmitter can be linked to control and monitoring systems via a digital data network, communicating its internal diagnostic information along with the basic process variable. Since verification is a process rather than an event, the condition of all process instruments on a unit can be monitored continuously while the process is running. Internal diagnostic routines can warn of a problem developing with any instrument.
Calibrations still need to be performed, but they become opportunities to verify known performance rather than to correct drift and errors. Calibrating a modern smart instrument should thus reveal no surprises.
For an increasing number of process instruments, the diagnostic routines built into the transmitter are able to examine the sensor element itself and find changes capable of affecting performance. The nature of these capabilities varies from one technology to another.
For example, the way a Coriolis flow meter measures liquid movement is much different from the method by which a capacitive sensor measures pressure. Nonetheless, each instrument knows what it should see in normal operation; any deviation indicates something may be wrong, and the instrument can warn operators and maintenance personnel accordingly.
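The underlying pattern is the same across technologies: compare a diagnostic value against its expected baseline and flag meaningful deviation. A minimal sketch, with invented baseline and limit values:

```python
# A minimal sketch of the "deviation from expected behavior" idea; the
# baseline and limit values below are invented for illustration.

def within_limits(observed: float, baseline: float, limit_pct: float) -> bool:
    """Return True if the observed diagnostic value is within its limit."""
    deviation_pct = abs(observed - baseline) / baseline * 100.0
    return deviation_pct <= limit_pct

# Example: a Coriolis meter's drive gain creeping upward can indicate a
# problem such as entrained gas; flag a drift of more than 20% from the
# baseline captured at commissioning (all numbers assumed).
baseline_drive_gain = 4.2  # percent
if not within_limits(observed=5.6, baseline=baseline_drive_gain, limit_pct=20.0):
    print("Diagnostic deviation detected: notify maintenance")
```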
Modern Calibration Procedures
Many calibration programs are built around old and often obsolete assumptions. In years past, when a new production-grade mechanical pressure gauge was going to be installed in a production unit, it went first to the calibration bench. The technician would compare it to a 4A certified test gauge, and if there was any deviation, it was a simple matter to open the case and make adjustments via setscrews. This approach is built on three key underlying assumptions:
- The gauge needs to be checked and possibly adjusted because it can’t be trusted out of the box
- A qualified technician can improve the instrument’s performance by tweaking it
- Regular ongoing calibration at short intervals on the bench is necessary to verify it is not drifting out of its measurement tolerance range.
The reality today is much different because none of those assumptions are correct. Here’s what should happen now with a new smart pressure transmitter:
- The technician takes it out of the box. It has been calibrated at the factory to a standard far more precise than a 4A gauge, and there is a certificate documenting the actual calibration process. An electronic version of the certificate can be uploaded to the calibration database
- The technician can check it on the bench if required, but there are no setscrews to tweak
- Anything the technician tries to do to improve calibration will only degrade performance. Unless the transmitter has been damaged, it should be installed as-is
- The technician may need to perform additional configuration steps (not calibration), but these are done electronically through a computer or hand-held communicator
In day-to-day operation, a quality pressure transmitter will exhibit great stability over long periods of time, and should a problem develop, self-diagnostic functions will detect it.
With today’s smart instrumentation, calibration is typically not something a technician does to fix a problem, but instead it is used for verification of correct function. The difference may be subtle, but the implications are huge. Some users embrace these capabilities and realize smart instruments provide the means to reduce the amount of required calibration, while still maintaining reliable operation within a validated process per regulatory requirements.
Some Calibration is Always Necessary
The capabilities discussed may reduce the frequency of calibration, but some calibration will always be required. Any component or system can fail, and pharmaceutical manufacturers need to be especially vigilant to make sure everything is operating as designed. Moreover, few facilities have the most technologically advanced instruments installed in every application. Most process units have a mix of mechanical, basic analog electronic and smart instruments — and a calibration program needs to address all these instrument types.
Different types of smart instruments require different calibration techniques. Obviously a temperature sensor has to be tested differently than a pressure instrument, but other elements, such as process criticality, also play a part in determining how to handle the checks.
Table 1 lists a variety of common smart instrument types, and gives a short description of how each is calibrated. Most are tested by simulating a process condition. Pressure and temperature instruments are relatively easy to check because it is easy to create an appropriate simulation. Flowmeters are more complex, particularly large ones, since creating a controlled and calibrated flow is not as easy.
Outsourcing calibration might involve bringing third-party technicians to the site to perform tests. Some instruments may have to be sent off-site to the manufacturer's facility or lab, although many third-party calibration providers operate mobile calibration rigs.
For instruments needing to remain in place or difficult to remove, calibration is often performed in-situ. This can limit the types of actions possible as it is not always practical to bring certain types of calibration equipment into the field. But for most instruments used in pharma manufacturing, a portable or hand-held interface, such as a HART communicator or a specialized calibrator, can do the job. A HART communicator allows technicians to interface with an individual HART-enabled process instrument. Using the 4-20 mA wiring, it can access diagnostic information, change the configuration and read historical information stored in the transmitter.
Hand-held calibrators have a wider range of capabilities related to calibration functions. Some are more elaborate than others, so individual models do not necessarily have all possible options. Calibrators perform the same basic tasks as a communicator, but add other process simulation abilities, such as a small air pump to test pressure sensors, or sophisticated electronic controls to simulate signals from thermocouples or RTDs.
Calibrators can be loaded with appropriate procedures for the instruments to be serviced during the specific round or shift. They can record all actions in the field, and then sync with an asset management or calibration maintenance management system to transfer field activity. Manual actions are minimized.
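As a rough sketch of that round trip, the record a calibrator might carry back for syncing could look like the following. The schema is invented for illustration; real calibrator and AMS integrations define their own formats.

```python
import json
from datetime import datetime, timezone

# Invented schema: a procedure loaded onto the calibrator before the round,
# and the field record it carries back for syncing with the AMS/CMMS.
procedure = {
    "tag": "TT-101",  # hypothetical instrument tag
    "test_points_c": [20.0, 60.0, 100.0],
    "tolerance_c": 0.5,
}

field_record = {
    "tag": procedure["tag"],
    "results": [
        {"point_c": 20.0, "reading_c": 20.1, "passed": True},
        {"point_c": 60.0, "reading_c": 60.2, "passed": True},
        {"point_c": 100.0, "reading_c": 100.4, "passed": True},
    ],
    "technician": "jdoe",
    "recorded_at": datetime.now(timezone.utc).isoformat(),
}

# The sync step would transfer something like this payload automatically.
print(json.dumps(field_record, indent=2))
```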
Whether on the bench or in the field, calibration is no longer a purely mechanical process with modern smart instrumentation. Sophisticated electronics have expanded the information available on an instrument's condition, which now often extends to specific calibration issues. The future in which process instrumentation can be verified electronically, remotely, continuously and automatically is already here for many smart instruments.
Compared with manual methods, this automated approach offers clear benefits:
• Requires less equipment
• Faster
• Less labor intensive
• More reliable
• More repeatable
• Self-documenting
Electronic Diagnostics/Asset Management
As mentioned earlier, smart instruments are capable of sending diagnostic information about various aspects of their condition and performance. Using this information in a calibration program or for condition-based maintenance requires monitoring the most important attributes for a given instrument and acting appropriately. Trying to do this manually can quickly become overwhelming with an instrument population of any size.
Effective companies automate the process of sorting through the information by using an asset management system (AMS). Each smart instrument has its own record in the system (Figure 1), and the AMS can communicate with each instrument through plant networks. Using a communication protocol such as HART, WirelessHART, or a mix of both, the AMS can poll each instrument following a pre-determined interval based on criticality. Each attribute has its appropriate operating range, and any critical deviation can be set up to trigger an alarm.
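A minimal sketch of that polling logic follows. The intervals, attribute names and limits are assumptions for illustration; a real AMS defines these per device and communication protocol.

```python
import time
from dataclasses import dataclass
from typing import Callable, List

POLL_INTERVAL_S = {"critical": 60, "standard": 600, "low": 3600}  # seconds (assumed)

@dataclass
class MonitoredAttribute:
    name: str    # e.g. a diagnostic variable exposed by the device
    low: float   # acceptable operating range
    high: float

@dataclass
class Instrument:
    tag: str
    criticality: str                      # key into POLL_INTERVAL_S
    attributes: List[MonitoredAttribute]
    read: Callable[[str], float]          # stand-in for a HART/WirelessHART read
    last_polled: float = 0.0

def raise_alarm(tag: str, attr: str, value: float) -> None:
    print(f"ALARM {tag}: {attr}={value:.2f} out of range")

def poll(instrument: Instrument) -> None:
    """Read each monitored attribute and flag any critical deviation."""
    for attr in instrument.attributes:
        value = instrument.read(attr.name)
        if not (attr.low <= value <= attr.high):
            raise_alarm(instrument.tag, attr.name, value)

def run(instruments: List[Instrument]) -> None:
    """Poll each instrument on an interval set by its criticality."""
    while True:
        now = time.monotonic()
        for inst in instruments:
            if now - inst.last_polled >= POLL_INTERVAL_S[inst.criticality]:
                poll(inst)
                inst.last_polled = now
        time.sleep(1)
```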
Some diagnostic attributes are purely the concern of the maintenance group, while others can affect the instrument’s ability to deliver its measurement accurately or reliably. For example, consider a typical smart pressure instrument. It is designed to read a range of 0-50 psi, and it sends this signal via a 4-20 mA current loop, with a reading of 46 psi translating to about 18.7 mA. If there is a malfunction in the power supply to the instrument such that it cannot provide more than 15 mA, the instrument will continue to work, but it cannot indicate its full range: roughly the upper third cannot be communicated. A smart instrument can recognize this deviation and send an appropriate message or alarm through the AMS.
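The loop arithmetic behind this example follows the standard linear 4-20 mA scaling; a short Python check shows where the cutoff falls:

```python
# Standard linear 4-20 mA scaling for a 0-50 psi range.
def psi_to_ma(psi: float, span: float = 50.0) -> float:
    return 4.0 + (psi / span) * 16.0

def ma_to_psi(ma: float, span: float = 50.0) -> float:
    return (ma - 4.0) / 16.0 * span

print(psi_to_ma(46.0))  # 18.72 mA: a healthy loop represents 46 psi easily
print(ma_to_psi(15.0))  # 34.375 psi: a loop capped at 15 mA tops out here,
                        # losing roughly the upper third of the 0-50 psi range
```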
Such a situation is not part of the calibration program per se, but it is critical information related to the instrument’s ability to function properly, which is part of the larger verification program. Under normal circumstances, diagnostic information from smart instruments generally says the instrument is functioning correctly, and therefore there is no reason to doubt the information it is transmitting. If a problem develops, the AMS can warn of it immediately.
Hand-held communicators and portable calibrators can interface with the AMS directly, transferring information, instructions and data in both directions. Communicators and calibrators thus become critical extensions of the AMS. The AMS becomes the main record-keeping mechanism for the larger instrument calibration and verification system. “Dumb” electronic instruments and mechanical instruments can be included in the AMS, but some information for those will have to be handled manually.
Conclusion
Consider what it was like owning a car 30 or 40 years ago. A 1975 or older model was far more maintenance-intensive, needing tune-ups, spark plugs and oil changes much more frequently than today’s vehicles. No sensible driver continues to perform all that service on a current model car because it doesn’t need it. The same is true of today’s process instruments.
Calibration is still necessary, but the sophistication and stability of instrumentation today makes it far easier to work with and less costly to maintain. Manufacturers can work with instrumentation suppliers, like Emerson, to modernize practices and capture significant cost reductions and labor savings, while still meeting all process validation, verification and regulatory requirements.