Navigating pharma's growing data integrity challenges
Throughout the life cycle of a single drug, an abundance of critical data is produced. As a project progresses from design through preclinical and clinical development to commercial manufacturing, the volume and criticality of this data scale in tandem. This growing complexity makes ensuring data integrity (DI) a significant challenge.
It is important for drug developers and manufacturers to understand the fundamentals of data integrity. Challenges in achieving and maintaining data integrity can often be attributed to a lack of critical thinking and the adoption of a ‘technology-first’ approach.
Instead of delivering data integrity with this mindset, the industry should focus on establishing a governance program that ensures both data integrity and security.
Keeping data integrity in mind, always
Drug developers and manufacturers understand that a single project can generate an abundance of essential data. The amount of data produced grows rapidly as a therapeutic moves toward the market and operations scale up to commercial levels.
Managing this wealth of data often relies on an array of computerized systems offering a range of benefits, from reducing human error to enhancing security and making processes more efficient. The most important of these is the critical role such systems play in implementing the technical controls that ensure data integrity.
Safeguarding DI, making sure data is complete, consistent, and accurate throughout its life cycle, is vital to data reliability and quality. Without DI, developers and manufacturers risk basing decisions on unreliable data, potentially compromising product quality and patient safety. In a worst-case scenario, this could lead to patient harm, recalls, remediation costs, delays and denied approvals.
Since the FDA mandated its first regulations in the early 1960s, it and other regulatory agencies have continuously published and updated guidance emphasizing the need to maintain DI controls within good practice (GxP) processes. In particular, regulators expect data to be ‘ALCOA’: attributable, legible, contemporaneous, original and accurate. The ALCOA+ principles place further emphasis on ensuring data is also complete, consistent, enduring and available.
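To make these principles concrete, the sketch below shows one way a single electronic record might carry ALCOA metadata, with a checksum to detect after-the-fact alteration. It is a minimal illustration under assumed names: the `GxpRecord` class, its fields, and the hashing scheme are hypothetical, not taken from any specific validated system or regulation.

```python
import hashlib
import json
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class GxpRecord:
    """Illustrative electronic record carrying ALCOA metadata (hypothetical schema)."""
    value: str           # the result itself, captured exactly ("accurate")
    recorded_by: str     # unique user or system ID ("attributable")
    recorded_at: str     # timestamp taken at the moment of capture ("contemporaneous")
    source_system: str   # the system where the record was first created ("original")

    def checksum(self) -> str:
        # A hash over the whole record lets later reviews detect alteration;
        # real systems pair this with secure, append-only audit trails.
        payload = json.dumps(self.__dict__, sort_keys=True).encode("utf-8")
        return hashlib.sha256(payload).hexdigest()

# Usage: capture the result as it is generated and store the record
# together with its checksum, so any subsequent edit is detectable.
record = GxpRecord(
    value="assay_potency=98.7%",
    recorded_by="analyst_jdoe",
    recorded_at=datetime.now(timezone.utc).isoformat(),
    source_system="HPLC-04",
)
print(record.checksum())
```

The point of the structure is that attributability and contemporaneity are captured at the moment the record is created, rather than reconstructed later.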
However, understanding and implementing the regulatory requirements surrounding computerized systems remains a persistent challenge for many developers and manufacturers, one that is compounded by the additional difficulties of scaling operations and projects.
Scaling complex projects
As drug products progress from preclinical to clinical phases and finally to commercial scale, critical data must be gathered to hit key milestones such as Investigational New Drug (IND) applications and Biologics License Applications (BLAs). With ever greater quantities of data collected at every stage, much of which will inevitably be scrutinized by regulatory bodies, ensuring data integrity is critical.
However, as innovative advances allow the pharma industry to produce treatments that are progressively more complicated and personalized, like cell and gene therapies, the complexity of the manufacturing processes involved rises as well. As a result, successful manufacturing requires increasingly complex supply chains, specialized logistics and exacting handling requirements.
With the rising intricacy of development and manufacturing processes, the probability of error also grows, particularly where those processes remain manual. If not controlled, these mistakes can result in increased production costs along with reduced product quality and compromised patient safety.
M&A further complicates matters
Globalization, business expansion and the need for increasingly intricate manufacturing processes for complex biologics continue to prompt mergers and acquisitions (M&A) across the industry. Growth through M&A may be a necessity, but each acquisition brings its own established digital infrastructure, entrenched legacy systems and integration complexity, all of which can undermine data integrity controls across the expanding digital estate. As a result, DI control deficiencies and risks are common in many pharma organizations.
The costs required to alleviate these issues and the complexity involved in implementing up-to-date systems have negatively impacted the efficiency of data-driven operations. This has been highlighted by a recent Pharma Manufacturing poll, which revealed that the pharma industry is still hesitant to accelerate the pace of adopting data-driven business models.
This inconsistent uptake of digitized solutions to improve efficiency while safeguarding data integrity is not sustainable; overcoming it will require the adoption of industry-developed standards. Unfortunately, many of the developers and manufacturers willing to implement new digitized solutions rely on approaches that can cause problems down the road.
Avoid the ‘technology-first’ approach
There are many technologies available that can be used to record, manage, and store data, which, when implemented correctly and with robust DI controls in place, can protect against human error and simplify regulatory compliance. One example of this technology is a validated laboratory information management system (LIMS), which harmonizes and standardizes relevant workflows, procedures, and tests. As a result, these platforms can help enhance the efficiency and quality of data flow management.
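As a loose illustration of what ‘harmonizing and standardizing’ can mean in practice, the sketch below defines a single test-method specification that every site executes identically, with the system rather than the analyst applying the acceptance criteria. The schema, names and limits are invented for illustration and do not reflect any particular LIMS product.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TestMethodSpec:
    """Hypothetical standardized test definition shared across all sites."""
    method_id: str
    analyte: str
    unit: str
    lower_limit: float
    upper_limit: float

    def evaluate(self, result: float) -> str:
        # The system, not the analyst, applies the acceptance criteria,
        # so every site flags out-of-specification results identically.
        in_spec = self.lower_limit <= result <= self.upper_limit
        return "PASS" if in_spec else "OOS"  # OOS = out of specification

# One authoritative definition, reused wherever the test is run.
potency = TestMethodSpec("TM-001", "API potency", "%", 95.0, 105.0)
print(potency.evaluate(98.7))   # PASS
print(potency.evaluate(93.2))   # OOS
```

Because one authoritative definition is shared, an out-of-specification result is flagged the same way at every site, removing one source of human error from the data flow.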
The success of such platforms has encouraged broader digitization, converting hard-copy, manual-entry documentation to a ‘paper-free’ form, for example. But just because a process can be digitized does not mean it should be: digitization does not always equal improved DI. Developers and manufacturers holding a ‘technology-first’ approach to operational problem-solving may not achieve the process improvements they seek. If a process is poorly understood, poorly defined, or not previously optimized, digitizing it will likely not make it better.
Skipping this step means developers and manufacturers may never truly understand the fundamental requirements of the process they plan to digitize. This lack of a clear understanding and established intended use is the cause of many challenges surrounding the adoption of digitized solutions, as reflected in the number of DI-related inspection observations concerning documentation and record management practices in recent years: 75% of FDA manufacturing warning letters cited DI as an issue in 2018, up from 28% in 2014.
A compliant approach to DI
Instead of delivering DI with a ‘technology-first’ mindset, drug developers and manufacturers should take the time to establish a governance program that ensures data integrity and security. Such a program should aim to prevent the challenges associated with scaling and increasingly complex therapeutics from impacting quality, while improving efficiency.
Best practice for ensuring data integrity and security across an organization is to build from the top down, with foundations based on the application of critical thinking. Those in management roles will need to establish a robust program incorporating appropriate organizational support for both technical and procedural DI controls. Throughout this process, how all data is created, collected, reviewed, processed, archived, and destroyed will need to be assessed; a simple sketch of such an assessment follows the questions below. Additionally, several key questions will need to be answered, including:
- What does “being compliant with the electronic record and electronic signature (ERES) related regulations of the FDA and EMA” mean?
- What data is critical and therefore needs to be protected and maintained?
- How can the organization ensure the validated status of a computerized system?
- How does the computerized system meet its intended use?
Only by answering these key questions can detailed procedures be defined and introduced with a solid understanding of why they are necessary.
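As a purely illustrative aid, the sketch below shows one way a governance team might record the outcome of the lifecycle assessment described above, mapping each data asset to its criticality, its system of record and the controls applied at each stage. The structure, field names and example entries are hypothetical, not a prescribed format.

```python
from dataclasses import dataclass

# Lifecycle stages named in the governance assessment above.
LIFECYCLE_STAGES = ("created", "collected", "reviewed", "processed", "archived", "destroyed")

@dataclass
class DataAssetAssessment:
    """Hypothetical worksheet row for one data asset in a DI governance review."""
    asset: str                  # e.g., a batch record or chromatography result set
    criticality: str            # "high" if it drives product quality or patient safety decisions
    owning_system: str          # the computerized system of record
    controls: dict[str, str]    # lifecycle stage -> technical/procedural control applied

    def gaps(self) -> list[str]:
        # Flag lifecycle stages with no documented DI control.
        return [s for s in LIFECYCLE_STAGES if s not in self.controls]

# Usage: a governance review walks every critical asset and surfaces gaps.
assessment = DataAssetAssessment(
    asset="HPLC assay results",
    criticality="high",
    owning_system="LIMS",
    controls={
        "created": "unique user login; automatic timestamping",
        "reviewed": "second-person review with e-signature",
        "archived": "read-only archive with periodic restore test",
    },
)
print(assessment.gaps())  # -> ['collected', 'processed', 'destroyed']
```

Walking every critical asset through such a worksheet surfaces the lifecycle stages with no documented control, which is exactly where new technical or procedural DI controls are needed.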
Key lessons
- As projects progress from preclinical to commercial, the amount of critical data grows as operations scale.
- With the increasing complexity of treatments and the intricacy of the processes behind them, there is an equally growing need for robust DI controls that allow accurate decision-making based on reliable data.
- These challenges are further compounded when digitized solutions brought in to elevate efficiency and protect data integrity are implemented without a clear understanding of the process they are meant to improve and how they could achieve this. It is therefore essential to implement a robust approach to data integrity while ensuring that regulatory requirements are met.