
// DATA //


Setting sights

Data integration in pharma is a prescription for improved efficiency and innovation

The pharmaceutical and healthcare industry has undergone a significant technological revolution in the last few years.

Traditionally, drug development and manufacturing were characterised as time-consuming, costly and fraught with uncertainty.

According to the Association of the British Pharmaceutical Industry (ABPI), on average it takes more than 12 years and £1 billion to develop a single new medicine and make it available to patients in the UK, a timeline that already accounts for inevitable delays.

The pandemic and recent technological advancements, however, have propelled the industry towards the adoption of new and innovative approaches.

From drug discovery accelerated by artificial intelligence, through greater patient access to digital services, to the adoption of electronic health records and blockchain technologies that support increasingly complex supply chains, novel technology is advancing the sector.

Alongside this technological revolution sits the large amount of complex data that accompanies these advances. These vast and complex data sets, whilst highly useful throughout the development life cycle, pose significant challenges to the drug discovery process.

Fast forward

Put simply, data integration involves collating data from different sources into a single, unified data set. In terms of accelerating the drug discovery process, data integration allows pharmaceutical researchers to access large, amalgamated data sets on which they can run multiple analyses for different purposes.
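To make the idea concrete, here is a minimal sketch in Python, assuming three hypothetical CSV extracts that share a patient identifier; a production pipeline would add source-to-target mapping, deduplication and validation on top:

```python
import pandas as pd

# Hypothetical extracts from three separate source systems,
# each keyed by a shared patient identifier.
clinical = pd.read_csv("clinical_visits.csv")
labs = pd.read_csv("lab_results.csv")
wearables = pd.read_csv("wearable_metrics.csv")

# Collate the sources into a single, unified data set,
# keeping every patient known to the clinical system.
unified = (
    clinical
    .merge(labs, on="patient_id", how="left")
    .merge(wearables, on="patient_id", how="left")
)

# One amalgamated table can now feed multiple downstream analyses.
unified.to_csv("unified_study_data.csv", index=False)
```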

Whether it’s computer modelling, which is especially valuable for identifying promising drug candidates and for precise tasks like drug molecular design and retrosynthetic analysis, or optimising the information flow between clinical trial and manufacturing units, these processes accelerate manufacturing timelines and create efficiencies.

With the industry facing increased competition, pushing companies to improve overall operational efficiency, it’s critical that organisations embrace data integration to stay competitive and streamline their processes.

Greater data integration between different R&D teams will be vital to enabling faster production of data outputs from a richer pool of data sources. By promoting greater collaboration, firms can eliminate workflow bottlenecks caused by siloed data and accelerate timelines from discovery through to market.

The shift towards decentralised clinical trials and remote patient experiences since the pandemic has created additional data complexities to address, and for clinical trials this abundance of data can be especially problematic.

Due to these changes, researchers are now faced with diverse data streams that include real-time health metrics, patient-reported outcomes and even genomic information.

To improve the success rate of trials and expedite positive patient outcomes, it’s important for firms to incorporate regular data quality checks into their processes.

By implementing these measures, firms can promptly identify and rectify errors. Such risk-based quality management accelerates decision-making and provides greater accuracy in research efforts, improving the chances of trial success.
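What such a check might look like in practice is sketched below; the field names and plausible-range rules are invented for illustration:

```python
import pandas as pd

def quality_check(batch: pd.DataFrame) -> pd.DataFrame:
    """Return the records that fail basic quality rules so they
    can be promptly identified and rectified."""
    issues = pd.DataFrame(index=batch.index)
    issues["missing_outcome"] = batch["reported_outcome"].isna()
    issues["age_out_of_range"] = ~batch["age"].between(0, 120)
    issues["duplicate_record"] = batch.duplicated(["patient_id", "visit_date"])
    return batch[issues.any(axis=1)]

# Run on every incoming batch; anything returned is routed for review.
flagged = quality_check(pd.read_csv("incoming_batch.csv"))
print(f"{len(flagged)} records need review")
```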

Ultimate navigation

Ensuring regulatory compliance is a key consideration for companies in the health and pharmaceutical sectors, especially due to the stringent regulatory standards and the critical nature of their services.

It’s not uncommon for companies in the pharmaceutical sphere to face serious fines related to a lack of data protection. In 2022, a software solution provider in France was fined €1.5 million for a data breach involving nearly 500,000 people.


‘Data integration is a key enabler for greater accuracy in drug development and patient safety’


Data integration is a key enabler for greater accuracy in drug development and patient safety. It achieves this by guaranteeing the precision and uniformity of data, establishing traceability, and offering a comprehensive data overview.

Ensuring that this data reaches the appropriate individuals is equally important. The pharmaceutical field encompasses diverse skill sets and personnel, making it essential to establish processes and frameworks that not only protect data from a security perspective, but allow it to be accessed by the right people and for the right purposes.

By utilising advanced platforms that can consolidate study data from various systems and sources, including the growing number of electronic health records and wearable health-tracking devices that trial decentralisation has brought, data scientists can create more robust data sets.

Platforms such as Domino provide a fast, cost-effective, unified AI platform that plays a pivotal role in bridging the gap between specialised IT systems and biomedical data. This, in turn, results in greater accuracy of insights sourced and developed from the data.

That said, integrating data from various sources still requires a comprehensive framework, especially to guarantee this insight accuracy. On top of using specialist data-gathering platforms, frameworks should be built around secure data-sharing protocols and interoperability standards.
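One way to put interoperability standards into practice is to validate every incoming record against an agreed schema before it joins the unified data set. The sketch below uses the Python jsonschema package, with an invented observation schema standing in for a formal standard:

```python
from jsonschema import validate, ValidationError

# Illustrative schema standing in for an agreed interoperability standard.
OBSERVATION_SCHEMA = {
    "type": "object",
    "required": ["patient_id", "code", "value", "unit", "recorded_at"],
    "properties": {
        "patient_id": {"type": "string"},
        "code": {"type": "string"},  # e.g. a shared terminology code
        "value": {"type": "number"},
        "unit": {"type": "string"},
        "recorded_at": {"type": "string", "format": "date-time"},
    },
}

def accept_record(record: dict) -> bool:
    """Admit only records that conform to the agreed schema."""
    try:
        validate(instance=record, schema=OBSERVATION_SCHEMA)
        return True
    except ValidationError:
        return False  # route to a quarantine queue for investigation
```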

By building these frameworks proactively, and by risk assessing prior to implementation, pharmaceutical firms can better support researchers’ capacity to forecast potential issues, thereby reinforcing patient safety.

Maximum output

In an industry as fast-paced as pharmaceuticals, continuously pushing the boundaries of what’s possible with data is necessary to remain competitive. With integrated data comes greater data agility, and across a complex chain this will help to prevent the knowledge gaps that impede strategic decision-making and erode competitive standing.

As companies continue to pursue this growth in their data systems, we are experiencing an exciting phase for the pharmaceutical industry, with a high degree of promising innovation.

While the sector might currently be lacking structure in this space, we can expect upcoming frameworks to help support the full data life cycle – from development to post-market monitoring.

The introduction of ICH E6(R3), although still in the drafting phases, represents a pivotal shift in how the credibility and integrity of the data generated in this new clinical trial landscape are assured.

Although the guidance is not mandatory, it’s important for firms to be aware of emerging regulatory standards, especially for compliance and data integrity. The revision was driven primarily by the limitations its predecessor, E6(R2), imposed on the adoption of novel trial designs and emerging technologies.

The updated guideline incorporates essential quality factors to prevent data errors that could harm trial results, while also assisting in randomisation, blinding and retention.

R3 also ties into risk-based quality monitoring (RBQM), the full potential of which will be unlocked by the technical advances and novel trial designs encouraged by the guidelines. RBQM involves the identification, evaluation and management of potential risks to trial participant safety, data integrity and regulatory compliance.

Firms should embrace RBQM as a measure to assess and mitigate the risks inherent to clinical trials.
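At its core, RBQM is a scoring exercise: identify risks, rate their likelihood and impact, and act on anything above an agreed tolerance. A minimal sketch follows, with the risk names and threshold invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """An identified risk to participant safety, data integrity or compliance."""
    name: str
    likelihood: int  # 1 (rare) to 5 (almost certain)
    impact: int      # 1 (negligible) to 5 (critical)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

# Illustrative risk register for a hypothetical trial.
register = [
    Risk("eDiary non-completion at remote sites", likelihood=4, impact=3),
    Risk("Unblinding through dosing logs", likelihood=2, impact=5),
    Risk("Wearable data gaps during transmission", likelihood=3, impact=2),
]

# Evaluate: anything at or above the tolerance threshold gets a mitigation plan.
THRESHOLD = 10
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    action = "mitigate" if risk.score >= THRESHOLD else "monitor"
    print(f"{risk.score:>2}  {action:8}  {risk.name}")
```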

In simple terms, greater quality management lowers the likelihood of regulatory issues and accelerates development timelines by minimising the time needed to address common issues found in studies without a quality-by-design approach.

For post-market, consolidated data sets allow for real-time monitoring and support pharmacovigilance professionals in detecting potential safety signals through ongoing trend monitoring.
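As a toy illustration of this kind of trend monitoring, assuming a hypothetical adverse-event extract from the consolidated data set (real pharmacovigilance work relies on formal disproportionality statistics), a rolling baseline can flag unusual spikes:

```python
import pandas as pd

# Hypothetical daily counts of a given adverse event from the consolidated data set.
events = pd.read_csv("adverse_events.csv", parse_dates=["date"])
daily = events.groupby("date").size().asfreq("D", fill_value=0)

# Compare each day with the trailing 28-day baseline, shifted so a spike
# cannot inflate its own baseline; exceeding mean + 3 sd raises a signal.
window = daily.rolling(28, min_periods=7)
mean, sd = window.mean().shift(1), window.std().shift(1)
signals = daily[daily > mean + 3 * sd]
print(signals)
```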

The adoption of RBQM alongside advanced technologies, like automation tools, will be crucial. Investing in automation tools and technologies to streamline processes and reduce manual errors in data handling can reduce the burden on patients and maintain a healthy study pipeline in compliance with regulatory standards.

Final analysis

Greater efficiency, compliance and the leveraging of emerging technologies are all areas firms are looking to improve, but effective data integration depends on bringing together a wide variety of organisational skill sets, which can be challenging. Creating a cross-functional work environment that encourages greater collaboration can help by harmonising all the diverse expertise available.

To drive innovation, organisations must continue to adopt new technologies and methodologies, not just to meet regulatory requirements, but also to compete in a demanding market.

Effective data integration is essential for managing the wealth of available patient information. Above all, it ensures that different sources can work in synchrony to provide insights for innovation, tailored patient care and regulatory compliance, while also preserving data integrity.


Tabitha Sleap, Management Consultant, Life Sciences at BIP UK.
Go to bip-group.com