Streamlining workflows from development to commercialization

Nov. 8, 2023
Digitalization with advanced analytics empowers pharma experts to speed lifesaving therapies to market

With patient populations expanding and supply chains in flux, pharmaceutical organizations are under mounting pressure to bring drugs to market as quickly and cost-effectively as possible. Adding to the crunch, these ambitious goals must be met without compromising drug safety or efficacy. To rise to the occasion and stay organized, companies must embrace digitalization across R&D, manufacturing and regulatory settings.

In this high-stakes and highly regulated industry, therapeutic efficacy and integrity depend on efficient, data-based decisions. For this and other reasons, pharma organizations are improving and operationalizing workflows across the entire drug life cycle by implementing advanced analytics, which enables seamless automation of equipment health monitoring, calculation standardization, scale-out and multi-phase monitoring.

From formulation development through commercial production, drugmakers require analytics that can adapt to the changing needs of the industry. Flexible analytics platforms that can handle discrete, infrequent lab data at the R&D scale must evolve with manufactured products to gather data across hundreds of reactors at the production scale. Ultimately, empowering engineers to streamline common workflows across each phase of the drug product life cycle will reduce costs and shorten timelines.

Process monitoring with advanced analytics

In drug development and production settings, a variety of processes are often at play. With batch, semi-batch and continuous processes all commonly used throughout the pharma industry, organizations require advanced analytics resources that can complement a variety of workflows.

Nowhere are these multiple process types more evident than in the world of small molecules, where fluid bed granulation, encapsulation and spray drying, among others, require dynamic analytics tools.

In a granulation case study shared by a top-five global pharma company, engineers struggled to monitor continuous and dynamic processes with conventional methods. By leveraging an advanced analytics solution, however, the team incorporated key input parameters, such as roller gap, press force and roller speed, to model granulation performance via calculated parameters, including solid fraction (SF), SF error and gap error.

With a near-real-time performance model in place, the team predicted deviations from process setpoints proactively instead of relying on reactive measures. In one instance, engineers identified SF and gap irregularities occurring mid-manufacture and stopped processing immediately to troubleshoot (Exhibit 1).
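The kind of workflow described above can be sketched in a few lines: fit a simple model relating roller-compaction inputs to solid fraction, then flag predictions that drift outside a tolerance around the setpoint. This is a minimal illustration only; the synthetic data, coefficients, target and tolerance are all hypothetical, not the company's actual model.

```python
import numpy as np

# Hypothetical process data: roller gap (mm), press force (kN/cm), roller
# speed (rpm) and measured solid fraction (SF) for 200 historical samples.
rng = np.random.default_rng(42)
n = 200
roller_gap = rng.uniform(2.0, 4.0, n)
press_force = rng.uniform(4.0, 10.0, n)
roller_speed = rng.uniform(2.0, 8.0, n)

# Synthetic "true" relationship with measurement noise (illustrative only)
sf = 0.45 + 0.02 * press_force - 0.03 * roller_gap + rng.normal(0.0, 0.005, n)

# Fit SF = b0 + b1*gap + b2*force + b3*speed by ordinary least squares
X = np.column_stack([np.ones(n), roller_gap, press_force, roller_speed])
coef, *_ = np.linalg.lstsq(X, sf, rcond=None)

# Predict SF and flag samples whose SF error exceeds a hypothetical tolerance
target_sf, tolerance = 0.50, 0.03
sf_pred = X @ coef
sf_error = sf_pred - target_sf
out_of_spec = np.abs(sf_error) > tolerance
```

In practice the model would run against live historian data so that an out-of-spec flag surfaces while the batch can still be paused, rather than after the fact.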

With continuously accessible near-real-time data, the models revealed refined setpoints and process improvements that can be fed into future campaigns. The team also noted reduced startup times and material savings during granulation. And with a reliable, data-informed view into process variability, the company improved its quality assurance as an added benefit.

While the company’s advanced analytics resources and digitalization initiatives helped improve its continuous manufacturing workflows, improvements did not end there. In another case study, the same team explored how to leverage advanced analytics solutions to understand process dynamics within a semi-batch wet granulation process.

With many cycles and unit operations in each batch, gathering and parsing the vast quantities of data was previously cumbersome and time-consuming. Driven by digitalization initiatives, the organization again turned to advanced analytics resources to standardize its workflow for interpreting large datasets.

Using dryer cell temperature, status, inlet temperature and humidity, and airflow as inputs, the team calculated average batch values to gain insights into batch performance and variation across dryer cells. In one instance, these averages revealed aberrant behavior in a specific dryer cell, which led the team to optimize the drying parameters, including drying time and temperature difference.
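The per-cell averaging step is straightforward to express with a dataframe: group the telemetry by dryer cell, average each parameter, and flag cells that deviate from the fleet mean. The column names, readings and 2 °C threshold below are illustrative assumptions, not the actual historian tags or acceptance criteria.

```python
import pandas as pd

# Hypothetical dryer telemetry across two batches and four dryer cells
df = pd.DataFrame({
    "batch": ["B1"] * 4 + ["B2"] * 4,
    "cell":  [1, 2, 3, 4, 1, 2, 3, 4],
    "cell_temp_c":  [60.1, 59.8, 64.9, 60.2, 59.9, 60.0, 65.2, 60.1],
    "inlet_temp_c": [70.0, 70.1, 70.2, 69.9, 70.0, 70.1, 70.3, 69.8],
    "airflow_m3h":  [120, 118, 121, 119, 120, 119, 122, 118],
})

# Average each parameter by dryer cell across batches
cell_means = df.groupby("cell")[["cell_temp_c", "inlet_temp_c", "airflow_m3h"]].mean()

# Flag cells whose average temperature deviates from the fleet mean by
# more than an assumed 2 degC threshold
fleet_mean = cell_means["cell_temp_c"].mean()
outliers = cell_means[(cell_means["cell_temp_c"] - fleet_mean).abs() > 2.0]
```

With the hypothetical data above, cell 3 stands out as the candidate for parameter optimization; the same grouping scales unchanged from a handful of cells to a full production fleet.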

Overall, this digitalization initiative generated material savings and reduced the need to reject material in future batches. With a streamlined analysis, the team automated the development of wet granulation monitoring views for sharing key insights across functional teams, and it noted improved product uniformity thanks to better monitoring and a data-based control strategy (Exhibit 2).

Near-real-time monitoring and alerts

When subject matter experts (SMEs) face competing priorities on top of delayed and infrequent data reporting, they almost always struggle to respond to process issues in a timely manner. Without realizing it, organizations often waste precious time and resources by not leveraging the right analytics software for the tasks at hand.

At a top-15 global pharma company, overworked engineers sometimes found themselves unable to respond to process excursions for weeks or even months. By automating process alerts with an advanced analytics solution, however, the organization reduced reaction time to hours, driving a production increase of more than 500,000 units annually.
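At its core, an automated excursion alert compares each new reading against limits derived from recent history, so nobody has to eyeball trend charts. The sketch below assumes a simple three-sigma rule and made-up readings; a production system would pull live historian data and route the alert to email or a dashboard.

```python
import statistics

def check_excursion(history, reading, n_sigma=3.0):
    """Return True if `reading` falls outside mean +/- n_sigma*stdev
    of the recent history (a simple three-sigma excursion rule)."""
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    return abs(reading - mean) > n_sigma * sd

# Hypothetical recent readings for one process tag
history = [100.2, 99.8, 100.1, 100.0, 99.9, 100.3, 100.0, 99.7]

alert = check_excursion(history, 103.5)  # clear excursion
ok = check_excursion(history, 100.1)     # within normal variation
```

Even this crude rule turns a weeks-long manual review cycle into an automatic check that fires within one data-collection interval.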

With competing priorities impacting not just engineers but operations teams as well, advanced analytics platforms can be leveraged on the manufacturing floor for near-real-time insights.

In another case study shared by the same organization, engineers developed real-time statistical process control charts to compare current hourly performance with past months of performance. By consolidating summary metrics in a dashboard accessible on the manufacturing floor, operations teams gained visibility into hundreds of potential deviations to focus on the highest priority variances.
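A dashboard that surfaces hundreds of potential deviations needs a way to rank them. One plausible approach, sketched below under assumed tag names and data, is to compute Shewhart-style control limits from each tag's historical baseline and score tags by how far their latest hourly value sits outside those limits, worst first.

```python
import statistics

def control_limits(baseline, n_sigma=3.0):
    """Lower/upper control limits from a historical baseline."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean - n_sigma * sd, mean + n_sigma * sd

def rank_deviations(baselines, current):
    """Score each tag by how far its latest value sits outside its
    control limits, highest-priority variances first."""
    scores = {}
    for tag, hist in baselines.items():
        lcl, ucl = control_limits(hist)
        value = current[tag]
        if value > ucl:
            scores[tag] = value - ucl
        elif value < lcl:
            scores[tag] = lcl - value
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical hourly snapshot compared against monthly baselines
baselines = {
    "granulator_torque": [50.0, 50.2, 49.8, 50.1, 49.9, 50.0],
    "dryer_inlet_temp": [70.0, 70.1, 69.9, 70.0, 70.2, 69.8],
}
current = {"granulator_torque": 51.0, "dryer_inlet_temp": 70.5}
priorities = rank_deviations(baselines, current)
```

An operations team scanning such a list starts with the largest excursion rather than triaging hundreds of charts by hand, which is the behavior the case study describes.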

The engineering team saw a subsequent increase in informed recommendations made by their newly empowered operations colleagues. By placing the power of data-informed decision-making in both engineers’ and operations’ hands, the organization noted productivity improvements and an increasingly collaborative culture.

Improved user experience with data infrastructures

It does not take much imagination to picture examples of advanced analytics tools helping improve batch quality, reduce downtime and support continued process verification efforts. However, digitalization efforts should not be constrained only to improving analytics workflows.

In a case study shared by a U.S. pharma company, an in-house team of engineers reported difficulties onboarding new employees to their existing data infrastructure: complicated equipment tag names and non-intuitive asset structures made accessing data burdensome. The team faced similar challenges onboarding new equipment, because each piece of equipment was often incorporated into the data framework in a completely new and unique way. Confronted with mounting complexity from a variety of users and onboarding methods, the team decided to start from scratch and design a new data infrastructure altogether.

The engineers began by aligning on a series of goals: empower users to find equipment and tags easily, standardize calculations and visuals for similar equipment and assets across sites, and automate deployment of the revamped system for new equipment. Leveraging an advanced analytics platform with integrated programming capabilities, the team's data science leaders developed user-oriented asset structures that mirror how employees visualize plant resources.

After creating standardized, user-friendly tag names, the team incorporated calculations into the asset structures. This enabled not only standardization of calculations and visuals but, more importantly, easy access to those standards across equipment and sites. With positive feedback from the broader team, the engineers enabled automatic deployment for future equipment by programmatically defining and deploying asset structures. Not only did existing data and advanced analytics users appreciate the transition, but the number of users accessing the newly available data doubled.
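The essence of programmatic deployment is a shared template applied to every new asset: friendly attribute names map to raw historian tags, and standard calculations ride along automatically. The sketch below is a simplified stand-in; the template contents, path scheme and tag names are invented, and a real deployment would push these definitions to the analytics platform's asset API.

```python
# Hypothetical asset template: friendly attribute names plus standard
# calculations that every instance of this equipment class should carry
TEMPLATE = {
    "attributes": ["Temperature", "Pressure", "Agitator Speed"],
    "calculations": {"Temperature Rolling Avg": "avg(Temperature, 1h)"},
}

def deploy_asset(site, area, equipment, raw_tags):
    """Build a user-friendly asset node that maps friendly attribute
    names to the historian's raw tag names, enforcing the template."""
    missing = [a for a in TEMPLATE["attributes"] if a not in raw_tags]
    if missing:
        raise ValueError(f"raw tags missing for: {missing}")
    return {
        "path": f"{site} >> {area} >> {equipment}",
        "attributes": {a: raw_tags[a] for a in TEMPLATE["attributes"]},
        "calculations": dict(TEMPLATE["calculations"]),
    }

# Onboard a new piece of equipment with one call instead of manual mapping
asset = deploy_asset(
    "Site A", "Granulation", "Dryer 3",
    {"Temperature": "AI_1234.PV", "Pressure": "PI_5678.PV",
     "Agitator Speed": "SI_9012.PV"},
)
```

Because every asset passes through the same function, a user who learns one equipment path can navigate all of them, and new equipment cannot be onboarded in an ad hoc, incompatible way.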

This case study exemplifies how accessibility barriers can prevent key stakeholders from interacting with data that is critical to their functions. With digitalization efforts driving user-base improvements, the process data was put to broader use. The engineering team in this case took to heart a bold refrain first published in 2017, although it is hardly a surprising truth in today’s age of digital transformation: “The world’s most valuable resource is no longer oil, but data.”1

Increasing process efficiency

Pharma organizations face ever-mounting pressure to manufacture drugs more rapidly, efficiently and cost-effectively, but in today’s manufacturing environment, meeting such tall orders requires embracing cutting-edge digitalization solutions.

By leveraging advanced analytics software, users are discovering rapid insights from time series process data, with integrated lab and manufacturing data in a single location. SMEs from formulation development to commercialization functional teams are gaining the ability to optimize production and save time that would have otherwise been spent manually compiling and sifting through vast quantities of information.

With solutions built to evolve with processes at lab and commercial scales, including batch and continuous processes, organizations can devote digitalization resources, instead of precious engineering resources, to monitoring and alerting. And rather than waiting weeks or even months to notice process deviations, SMEs are now utilizing digitalization tools to identify developing issues and monitor process health in near real time. These same resources streamline and automate the scale-out of facility and equipment data accessibility, helping even brand-new process engineers quickly get comfortable with the data they need.

Batch-saving decisions on the plant floor are empowering companies to reduce waste and increase development speeds. And pharma organizations across the industry are leaning on digital resources to monitor process data throughout the drug life cycle, providing better insights for quality metrics and faster delivery of lifesaving drugs.

About the Author

Tatum O'Kennedy | Analytics Engineer, Seeq Corp.

After graduating from Northeastern University with a master’s in chemical engineering, Tatum worked in a variety of roles in the pharmaceutical industry, including formulation development of small molecules, product development of spray-dried dispersions and multiparticulates, and manufacturing science and technology engineering for drugs undergoing commercialization. In her role at Seeq, Tatum continues to combine her passion for the pharmaceutical industry with a love for analytics and works with pharmaceutical companies to generate the most value from their time series data.