How many times have you needed to investigate an important issue, or proactively design a new process, only to be delayed because the data wasn’t available? Does your organization often decide the effort of finding data, that you know exists somewhere, isn’t worth the time to pull it together? Does an FDA query make your knees weak?
Data is a tremendous asset to be captured, analyzed and made readily available — but this can be easier said than done. These data assets are extremely valuable and should be fully leveraged, which requires “yes” answers to the following questions:
• Do we know where historical data assets are located?
• Do we know how to access data being generated by CMOs or other partners?
• Can we quickly respond to client and/or FDA queries?
• Are we effective at accessing and analyzing our data?
• Are we working collaboratively across sites and teams?
• Do scientists play a leadership role in driving the data strategy?
• Does our strategy support our scientific needs?
If you’re like many leaders, at least some of these questions keep you up at night. Odds are you routinely discuss options, and even toy with technologies to provide answers.
The most common roadblocks are complexity, scalability, IT concerns and cost. The good news is that recent advances in computing technology — enhanced cloud computing, faster calculation speeds, more database options and data analytics applications — can be brought to bear on these issues in three areas:
Historical Data Access: Access to historical data needs to be robust, particularly as organizational changes occur and impact knowledge retention. It is not uncommon to need access to significant amounts of historical data within very tight timelines.
Effective Data Access and Analysis: With the advent of high-throughput data acquisition, there is a need for high-speed data analytics and visualization. Yet most analysis is still done in time-consuming spreadsheets, using data gathered manually from disparate sources.
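The manual spreadsheet workflow described above can often be replaced with a short script. The sketch below is purely illustrative — the file contents, batch IDs and column names (historian process data, sample-management assay results) are invented for the example — but it shows the basic pattern: join disparate exports on a shared key and compute summary statistics automatically instead of by hand.

```python
# Illustrative sketch: replacing manual spreadsheet collation with a small
# script that joins two hypothetical data extracts on a batch ID.
# All file contents and column names here are invented for the example.
import csv
import io
import statistics

# Hypothetical export from a process data historian (one row per batch).
process_csv = """batch_id,final_ph,temp_c
B001,7.1,36.9
B002,6.9,37.1
B003,7.0,37.0
"""

# Hypothetical export from a sample management system (assay results).
assay_csv = """batch_id,titer_g_per_l
B001,4.2
B002,3.8
B003,4.5
"""

def rows(text):
    """Parse a CSV string into a list of dicts keyed by column name."""
    return list(csv.DictReader(io.StringIO(text)))

# Join the two sources on batch_id -- the step usually done by hand.
assays = {r["batch_id"]: r for r in rows(assay_csv)}
joined = [{**p, **assays[p["batch_id"]]} for p in rows(process_csv)]

# A summary that would otherwise be a spreadsheet formula.
titers = [float(r["titer_g_per_l"]) for r in joined]
print(f"batches: {len(joined)}, mean titer: {statistics.mean(titers):.2f} g/L")
```

In practice the inline strings would be replaced by exports or API queries against the actual source systems; the point is that once the join key and columns are known, the consolidation step is repeatable and auditable rather than manual.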
Cross Site/Cross Group Collaboration: Therapies need to be quickly transitioned from a proof-of-concept phase to a scale and quality suitable for commercial production. Cross-group knowledge is paramount to achieve a successful launch, ensure patient safety and meet regulatory requirements.
Decisions to address data access often begin with an expensive, longer-term, IT-focused solution, which can unfortunately put the scientist in the back seat. A better choice is to put your scientific leaders in control, backed by a strong IT partnership, for a twofold advantage. First, your scientific leaders get the tools they need to directly impact your pipeline today. Second, nobody knows better than your scientific experts what they need to develop your medicines, so this ensures their buy-in.
To get the most from your data, take a look at process data historians, increasingly useful sample management options and user-friendly analytics applications such as Seeq. These tools can simplify data capture, deliver traceability among samples and provide users with streamlined data analysis and reporting.
A data-centric company in the biotechnology space, Just Biotherapeutics in Seattle, is supporting the creation of a link between the LabKey Biologics product and the Seeq data analytics application to provide a complete, scalable, scientist-focused system for actively utilizing data assets across their development space. “Active data capture into structured systems tailored for the biopharmaceutical process is critical in our goal of accelerating development of biotherapeutics and substantially reducing their manufacturing cost to increase global access,” says Randal Ketchem, VP of Molecular Design at Just.
Rethink your data strategy, and drive it forward by putting your scientific community in the lead. Take immediate action to implement solutions leveraging existing investments, while providing flexibility for the future.