Dino Busalachi, Applied Group: “Applied Group provided consulting for a pharmaceutical company that recently acquired the assets of another company and found that the acquired company’s data capture was entirely paper-based because the organization said it was concerned about data storage. But from our perspective their concerns were less about data storage and more about compliance, change management, operations and the introduction of new technologies and applications. So to this client, having paper in their manufacturing operations was deemed the least risky way of managing data. But frankly, if you are paper-based, the information provided is very stale. Managers are getting information weeks or months too late to do anything meaningful with it.
At the heart of successful cGMP and QbD-based pharma manufacturing is real-time operational data generated by highly networked control, sensing and analytical technologies across the process continuum. The volume of data generated by pharma enterprise information and manufacturing technologies is immense, but the availability of data is not the issue — what is at issue is the ability to generate high-quality data, then extract and shape meaningful, relevant information from it, and deliver it to the organization in forms that serve business and process decisions.
The efficient flow of data and information from process to executive suite and back is dependent on a well-organized, modern data/informatics infrastructure. But for many organizations basic data input and information handling remains haphazard and antiquated, relying on paper records, subject to human-induced error. Similarly, access to decision-supporting information continues to be problematic, with knowledge kept in silos and behind other artificial barriers that ultimately impinge on efficient, cost-effective operations.
To gain clarity on IT-related issues facing the industry today, Pharmaceutical Manufacturing sought input from leading IT suppliers and consultants, convening a virtual panel to discuss trends in Pharma information technologies and reveal insight into how companies can better manage this aspect of their operations in pursuit of operational excellence and business success.
Pharmaceutical Manufacturing: In spite of commercial realities prompting closer alignment between functional executive ranks (i.e., facility/manufacturing managers) and higher-order business intelligence IT operations, the perception is that a gap still exists. In your experience, what has been effective at closing this gap and creating the alignment the pharma enterprise needs to be successful?
K.R. Karu, industry solution director, Sparta Systems Inc.: “Closing the gap and aligning information within the pharmaceutical enterprise is best accomplished by globally harmonizing processes and identifying which systems will manage which subset of data. These systems should then be integrated so master data can be stored in one system and consumed by all. When global data is harmonized in a strategic group of systems, the data can be used by many for their particular function within the company.”
Dino Busalachi, practice director, Unified Manufacturing, Applied Group: “The gap exists and unfortunately, in our experience, it is all too often the ‘norm.’ For many years, the executive suite dictated its information priorities for the overall organization and IT served those interests; however, corporate IT and manufacturing have generally not possessed an intimate understanding of each other’s needs and concerns, leaving the operations/facility leadership to their own devices to collect, deliver and leverage valuable operational intelligence.
However, the need for alignment between IT and manufacturing IT, supported by an integrated IT and analytics infrastructure, is critical — especially when it comes to the FDA and compliance. Owning and implementing an enterprise-wide intelligence solution has historically been capital intensive and challenging; it requires a close, ongoing relationship between corporate IT and manufacturing to deploy and sustain the solution. Given the often tenuous relationship between corporate IT and manufacturing, many executive boards have chosen not to invest the significant capital necessary to build the required infrastructure and implement manufacturing intelligence solutions.
Ken Rapp, managing director for Analytical, Development, Quality and Manufacturing, Accelrys: “During the course of our work with Pharma industry leaders, Accelrys has identified two specific pain points — the independence and isolation of standalone vendors/applications, and the inability to connect data from one application to another. This hinders new product innovation and decreases efficiency because information valuable to new product development cannot be accessed and shared up and down the value chain. Information never becomes knowledge. We think addressing these two problems is core to closing this gap in Pharma. Technology that effectively connects data and provides valuable knowledge upstream and downstream, from product development through commercial applications, is needed. We believe applying technologies that provide the ‘scientific platform’ to allow communication between disparate applications is the most effective way to close the gap in the Pharma industry.”
PhM: Most will agree that process data and record keeping are not managed as well as they could be in the Pharma space. Paper-based systems remain pervasive and record-keeping lapses have been identified as a major factor in compliance issues with regulators. But in spite of the obvious risk, it is feared that change will lead to tremendous expense associated with process revalidation and fresh exposure to regulatory scrutiny. Are these fears justified? And how might they be overcome?
Trish Meek, product strategist, Informatics Business, Thermo Fisher Scientific: “No, those fears are not justified. I think these fears are born out of experiences that happened in the past, but the technology has improved greatly over the past five years. The reality is that the cost of maintaining paper processes far exceeds the cost of integrating systems. The other fact to consider is the potential cost to quality [that stems from] not removing manual, paper-based processes. The best a human being can achieve is four sigma for transcription activities. That means that for every 1,000 results someone transcribes from an instrument, they will make 3-6 mistakes. This error rate increases drastically to 3 per 100 if there is math or stress involved. If just one of these mistakes results in a batch being approved and then later recalled, or a failed FDA audit, the cost of this one mistake would far exceed the cost of the IT system that would have prevented it.
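To put those error rates in perspective, here is a minimal back-of-the-envelope sketch in Python; the monthly transcription volume and the share of results involving calculations are hypothetical assumptions, not figures from the panel:

```python
# Illustrative only: scales the transcription-error rates quoted above to a
# hypothetical monthly workload to show how manual errors accumulate.
RESULTS_PER_MONTH = 20_000        # assumed results transcribed per site per month
BASE_ERROR_RATE = 4.5 / 1000      # midpoint of the quoted 3-6 errors per 1,000
CALC_ERROR_RATE = 3 / 100         # quoted rate when math or stress is involved
SHARE_WITH_CALCULATION = 0.25     # assumed fraction of results requiring calculation

plain = RESULTS_PER_MONTH * (1 - SHARE_WITH_CALCULATION) * BASE_ERROR_RATE
calculated = RESULTS_PER_MONTH * SHARE_WITH_CALCULATION * CALC_ERROR_RATE

print(f"Expected errors/month on simple transcriptions:     {plain:.0f}")
print(f"Expected errors/month on calculated transcriptions: {calculated:.0f}")
print(f"Total expected transcription errors per month:      {plain + calculated:.0f}")
```

Even at four-sigma performance, a modest workload under these assumptions yields a couple hundred errors a month, any one of which could end up in a batch record.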
Forrest Rudnick, vice president, Vendor Management and Compliance, Datalynx-U.S.: “First and foremost, patient safety is the primary concern. This, above all other elements, cannot be sacrificed. I would also include risk management, addressed through business impact assessments and audit and remediation efforts to improve the compliance aspect, which would be accomplished by a change in mindset and effective, continual training. Removing the human element with validated systems and processes that provide repeatable and consistent output will yield a tremendous amount of data, either correct or incorrect, but consistent. So if the data is incorrect, the fix would be relatively easy compared with a human-input, paper-chase system.
Implementing an IT mindset and moving from paper to an integrated infrastructure will be met with mixed emotions depending upon which group within the company you address. Finance will not see it the same way as data managers or process and operations personnel. Of course, senior management will always demand more, and one way to deliver is through consistent process applications and a well-founded infrastructure inclusive of electronic signatures, repository storage, data backup and recovery, and disaster recovery plans. Each of these comes with a price, but the performance far outweighs that of paper systems and the human interaction required to maintain them.
K.R. Karu, Sparta Systems Inc.: The fears of expense and regulatory scrutiny are unfounded and the opposite is actually the case. Managing processes and the related data from auditing, deviation management, CAPAs, lab investigations, change management and complaints in a harmonized global system creates efficiencies and opportunities for savings rather than an expense. Instead of being locked in siloed systems or paper records, these process records are easily shared and visible to all who are involved in the process.
Root causes of issues and the corrective actions can be shared across other functional areas that may have the same conditions, and preventive measures for future problems can be addressed. Companies that cling to paper systems because of perceived cost savings and perceived protection from regulators eventually find that they are actually exposing themselves to expensive manual processes that require excessive head count and create systemic quality issues that regulators can uncover with little effort. Once regulators find inconsistent adherence to SOPs and differences in methodology and record quality between sites and functional areas, companies will find themselves exposed to expensive regulatory actions.
Ken Rapp, Accelrys: The fear of change and expense associated with migrating to a paperless system is historic in the pharma space. However, transitioning to a paperless system is ultimately essential for companies to thrive — reducing total cost of ownership and time to value while increasing quality and compliance standards. The efficiency of the new paperless systems is unparalleled compared with old-fashioned paper-based systems, and eventually the ROI will be too great for companies to ignore.
Also, when data is on paper it can walk off your floor at any time, not to mention the fact that trending on paper is hard. In all practicality, it is impossible to do trending on paper, let alone pursue continuous improvement or increased process understanding.
Documentation is a big deal. Applied Group was born in the manufacturing space as a consultant, so documentation and change management processes are in our DNA. Part of the issue is that they’re afraid ... and this is true for all highly regulated industries — whether that’s gaming, health care, financial services or manufacturing. They’re afraid of what the regulators are going to say to them if they change. But if the process is documented, logical and thorough, it’s less risky. A lot of the recent 483s and other issues that manufacturers have been called out on are really because of a lack of rigor in their data and record keeping and their inability to back claims with high-quality process data.
Lane Hirning, product manager, MasterControl: “Are the fears justified? Yes and no. The expense of moving to an electronic system is a reality, but the benefits are also well known. There are expenses around the purchase, configuration, validation, and training of users on a new system. There is also the fear that the company will need to keep both the legacy and electronic systems active for some period of time with no clear ‘cut-off’ due to lack of confidence in the new system.
I believe that this fear is based on the long deployment timelines for customized solutions of the past. The newer, truly ‘off-the-shelf’ software offerings and newer software validation strategies should overcome those fears. Software that is configured to the needs of the customer instead of customized is easier to implement, test, and validate. Configuration vs. customization decreases time to ‘go live’ and much of the ‘overlap’ of legacy and new electronic systems. Vendor-executed Operational Qualification (OQ) and automated OQ testing decrease the time required for validation and allow managers to use risk-based decision making to test only the higher-risk components rather than waste time in lengthy validation efforts. I also believe that the current trends in IT virtualization will lead to easier, faster, and more trusted disaster recovery (DR) solutions, which also bring about confidence in the new systems.
Cloud offerings are beginning to be considered and, even with Pharma’s overwhelming intellectual property and security concerns, some applications will be appropriate for the cloud and allow for faster deployment and greater access. Software offerings requiring extensive validation may not yet be ready for cloud deployment, but that problem is being worked on by a number of vendors. The bottom line is that, both from the vendor side and given the greater acceptance of electronic solutions by regulatory agencies, the move to electronic solutions is not an ‘if’ but a ‘when,’ and many of those regulatory bodies will start dictating timelines. The time is now.”
PhM: Improvements in the Pharma industry tend to revolve around five key areas:
• Assuring product quality/patient safety
• Driving out cost
• Accelerating time to market
• Reducing risk
• Improving compliance
How can information technologies best be deployed to improve on these imperatives?
K.R. Karu, Sparta Systems Inc.: “Progressive companies that have implemented enterprise-wide harmonized systems for managing quality processes have found measurable improvements in all five areas listed above. When global processes and data capture are in place for auditing, issue investigation and resolution, CAPA management and change management, problems are discovered and remediated earlier in the manufacturing process, which makes for a better and safer product released to market. The earlier a problem is discovered and resolved, the less it costs, which ultimately reduces overall manufacturing costs and reduces shortages. If lessons learned in one area can be applied to many, this creates further efficiencies and helps produce more ‘first time right’ results, which reduces risk and accelerates time to market. Having global systems in place that are proven to produce these results always improves compliance with regulations.”
Ken Rapp, Accelrys: “We believe the key to improving these five areas is standardization. By employing the best practices of the industry — the ‘tried and true’ automated processes that have proved they provide value — the cost of ownership decreases dramatically while time to value improves dramatically. We like to think of these automated solutions as being to IT what Betty Crocker cake mix was to baking. Just add eggs and water, and you’re guaranteed an excellent cake every time without the fuss of gathering and measuring various ingredients and going through tedious preparations. In the same way, the best practices of lab automation make experimentation more efficient, driving costs down, improving product quality, accelerating time to market, reducing risk and improving compliance.”
Dino Busalachi, Applied Group: “Concerns around product quality, cost reductions, improved profitability, risk reduction and compliance improvements — those frankly are what drive our conversations with our customers. Within the manufacturing side of our business, that is, our automation practice, we have a heavy focus on OEE [overall equipment effectiveness]. There’s the famous quote from management consultant and author Dr. H.J. Harrington: ‘Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you don’t understand it, you can’t control it. If you can’t control it, you can’t improve it.’ From our perspective, it’s all about OEE, it’s all about improvement and it’s all about real-time contextual intelligence.
Trending is a big piece of OEE; for example, we can measure output from sensors on specific devices in a manufacturing operation, and look at how that particular sensor performs for a given period of time, enabling preventive maintenance on a cell line. But the possibilities are limitless. People can do a lot of things with information that leads to improved quality and operational efficiency — decision support in real time. You can’t pursue your business goals if you don’t manage your assets effectively and understand what’s happening in real time. With access to real-time manufacturing/operational intelligence, you gain the ability to manage forward, versus managing from a rear-view mirror.”
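For readers unfamiliar with the metric, the conventional OEE calculation multiplies availability, performance and quality. The following is a minimal sketch, not Applied Group’s tooling, with made-up shift figures to show how sensor-derived counts roll up into the score:

```python
# Minimal OEE sketch with made-up shift data (not Applied Group's tooling).
# Conventional formula: OEE = Availability x Performance x Quality.
planned_minutes = 480          # one 8-hour shift
downtime_minutes = 45          # hypothetical unplanned stops reported by the line
ideal_rate_per_minute = 60     # hypothetical ideal output in units/minute
total_units = 24_000           # hypothetical units actually produced
good_units = 23_400            # hypothetical units passing quality checks

run_minutes = planned_minutes - downtime_minutes
availability = run_minutes / planned_minutes
performance = total_units / (ideal_rate_per_minute * run_minutes)
quality = good_units / total_units
oee = availability * performance * quality

print(f"Availability: {availability:.1%}")
print(f"Performance:  {performance:.1%}")
print(f"Quality:      {quality:.1%}")
print(f"OEE:          {oee:.1%}")
```

Tracking these three factors per line and per shift is what makes the trending Busalachi describes possible; a sustained dip in any one of them is the cue for preventive maintenance or a process investigation.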
Trish Meek, Thermo Fisher Scientific: “Because LIMS are tightly integrated with other enterprise operation systems such as ERP, insights from the lab have the potential to be even more central to businesses seeking true enterprise-wide agility. Businesses aren’t simply capturing and collecting data; they are making data actionable across the enterprise, putting management in the position to transform their businesses into agile organizations capable of responding quickly to market trends or new regulations and flexible enough to recognize and capitalize on cost-saving or margin-growing opportunities in the future.
Today’s LIMS is far more than just a lab information system. It is also a laboratory resource planning system. And while the concept of a truly ‘paperless lab’ has been hotly debated for many years, it is really only now coming into its own. The latest informatics solutions are capable of fully integrating the laboratory, eliminating most of the paper-based processes that have caused bottlenecks in workflow or contributed to errors in transcribing results and generating reports.
In addition to taking advantage of the latest LIMS functionality, we also find that integration and data visualization are key components in our customers’ paperless lab strategies. Data visualization takes integration to the next level. The ability to see not just the final result number, but the actual chromatographic or spectral data, gives scientists the ability to quickly identify whether a result reflects a bad sample or a bad run. This enables them to quickly escalate problems with a batch or an environmental contaminant, or to retest the sample as a priority and approve the batch.”
PhM: What, in your opinion, are the key elements of an effective data/informatics infrastructure?
Ken Rapp, Accelrys: “For a data/informatics infrastructure to be effective, it must be flexible enough to meet the needs of a very dynamic customer environment. Customers should have the ability to use barcoding, connected computing or mobile technologies. Everything should be accessible whether on premises or in the cloud, and it should have the ability to migrate from what is currently in place today to a new environment tomorrow. It’s also essential for customers to have the ability to work with their partners, and this partner ecosystem is constantly changing and evolving, so the infrastructure must have the ability and flexibility to change as well.
Another key component is standardization. In addition to the benefits mentioned above, standardization accommodates the flexible use of personnel. With personnel shifting from department to department frequently, the ability to provide them with consistent tools regardless of their current setting is essential to ensuring data informatics solutions are effective.”
Forrest Rudnick, Datalynx-U.S.: “Key obstacles to achieving an effective data/informatics infrastructure facing pharmaceutical manufacturers today include an unstable economy, rising costs and fierce competition. Constant pressure from leadership to improve operational costs while embracing quality and long-term output means that cost reduction and quality/productivity improvement are always at the forefront of executive meetings. With growing global competitiveness placing undue pressure on cost, quality and customer demands, people cannot measure, monitor and control performance for all personnel throughout the process. It is important to track and leverage information technologies/systems and automation to satisfy management.
Measures can be taken to meet all the immediate needs through automation of monitoring inputs and outputs, trend analysis reports and by making simple processes repeatable. Review documentation each month to ensure adherence or improvement. Invoke an incentive program to catch problems early. This addresses two areas of concern: it gets your employees involved with solution management and it helps reduce operating costs. Every company is good at putting together task forces to identify cost problems, but no company puts together a task force to close the cost gaps identified.”
K.R. Karu, Sparta Systems Inc.: “The first step is to identify your core strategic systems for data, including ERP as the foundational system, and then Enterprise Quality Management Software (EQMS), LIMS, Document Management, CRM and any others that may be necessary to run your business. Then define which systems are responsible for which processes, and make sure systems can share data and hand off results to each other. Embrace technology advances: for example, using iPads during audit processes and mobile devices to report events as they are discovered.
Finally, collecting data is not enough. Identify how data is analyzed. Having an analytics tool to slice and dice the data for each individual business area allows for the discovery of trends and better, quicker fact-based decision making.”
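As a concrete illustration of that slicing and dicing, the following is a minimal sketch using pandas over a handful of invented deviation records; the sites, column names and figures are hypothetical and do not represent any vendor’s schema:

```python
# Hypothetical example of trending harmonized quality-event data; the records,
# sites and column names are invented for illustration only.
import pandas as pd

deviations = pd.DataFrame([
    {"site": "Plant A", "root_cause": "Transcription error",   "days_to_close": 12},
    {"site": "Plant A", "root_cause": "Equipment calibration", "days_to_close": 30},
    {"site": "Plant B", "root_cause": "Transcription error",   "days_to_close": 45},
    {"site": "Plant B", "root_cause": "Transcription error",   "days_to_close": 38},
    {"site": "Plant B", "root_cause": "SOP not followed",      "days_to_close": 21},
])

# Slice by site and root cause to see where recurring problems cluster.
event_counts = deviations.groupby(["site", "root_cause"]).size().rename("events")

# Track how quickly each site closes its investigations.
avg_closure = deviations.groupby("site")["days_to_close"].mean().rename("avg_days_to_close")

print(event_counts, "\n")
print(avg_closure)
```

The same grouping logic scales to thousands of records once the underlying systems share data, which is the point of harmonizing them in the first place.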
PhM: What do you think are the consequences to the Pharma industry if it fails to effectively leverage information technologies to tackle industry challenges?
Forrest Rudnick, Datalynx-U.S.: “A key area of misunderstanding [stems from the fact that] many of today’s executives did not come up through the ranks of the blue-collar environment, so they do not grasp the far-reaching effects and the impact on all the processes and systems involved when demanding greater output or higher efficiency at a lower cost. FDA Warning Letters and notices of violation repeatedly identify the same issues regardless of which facility is inspected, primarily because humans do not perform as consistently as machines. Where a machine can run 24 x 7, be shut down for preventive maintenance or calibration and then be up and running again with consistent output, people cannot. Even on their best day, humans will always introduce errors into the process.
IT automation has the ability and capacity to outperform a human in so many areas that it can make those FDA 483s a thing of the past. Without the intervention of a mechanism that has repeatability with consistency, the FDA will continue to find the same problems over and over. Reducing time to market, lowering costs and providing a consistent product will remain a pipedream unless IT automation is introduced into the process. IT automation still requires human interaction, continual training, mentoring, and adapting to new technology. Without this progression, elements that have haunted the pharma industry will live on indefinitely.”
Ken Rapp, Accelrys: “I don’t think it is possible for the pharma industry to fail here, and it is [our] mission to ensure that they don’t. There are already too many pharma companies on the path of effectively leveraging information technologies and succeeding for the entire industry to turn back. While some companies may not adopt these technologies, they will be the ones falling behind, not the industry. We’re at the tipping point with lab automation — the more companies that successfully implement the systems, the more others will need to follow suit.”
Lane Hirning, MasterControl: “As with submissions, safety reporting, and other regulatory responsibilities, the agencies will realize the efficiencies of electronic processes and begin to mandate them to the industry. This will force changes and catch a number of companies unprepared. That will cost them time and money as a result of delayed submissions and of bringing in contractors or consultants to augment the workforce while the new systems are installed, tested, and validated. In the interim, companies will be falling short of what is expected by agencies, since the agencies will be adjusting timelines to match the efficiencies gained by electronic processes while those firms remaining in the paper/manual world will not be able to keep up – requesting extensions, missing deadlines, and receiving warnings during audits.” (See White Paper: “Six Corporate Oversights of Quality and Compliance Issues in Pharmaceutical Environments:” bit.ly/12jUvLr.)
Trish Meek, Thermo Fisher Scientific: “To push the boundaries of innovation, companies across the life sciences spectrum must assiduously monitor performance and quality and be ready to capitalize on opportunities to transform and grow. The good news is that many established pharmaceutical companies have spent more than two decades methodically adding technology in preparation for these challenges. But all this investment could be for naught unless these companies take deliberate and strategic steps to align non-integrated, often disparate resources in ways that enable maximum agility for their businesses. And for a lab of any size, new technologies go well beyond instrument advancements alone; cloud and mobile computing, for example, are driving major changes that not only affect business velocity, but also lower entry barriers to increasing competition. In this way, technology is an equal-opportunity catalyst that puts even more pressure on CIOs to stay ahead.
The real risk to today’s pharmaceutical companies is that their competitors will discover how to more effectively leverage information technology. There is great focus in the industry on improving pipelines and optimizing manufacturing operations. Our customers are working with us to determine how to leverage their existing investments. Companies that aren’t thinking this way may fall behind in the market.”
K.R. Karu, Sparta Systems Inc.: “Companies that fail to leverage the information they can receive using technology are destined to have limited or declining growth. Being nimble and making decisions based on all of the facts can help assure a company will provide the market with safe and effective products that are trusted by doctors and patients, while creating manufacturing efficiencies that will allow faster time to market in a highly compliant manner.”
Published in the July 2013 edition of Pharmaceutical Manufacturing magazine