
// CLEAN DATA //



Drive time

Reliance on data in the life sciences is increasing exponentially – here are the essential points to start driving your systems

In future, a whole range of interconnected business functions – including external entities along the supply chain and regulatory processes – should be able to rely on a comprehensive, approved master data set. That data set could support submissions and compliance activities, as well as internal analytics and management reporting.

Without complete confidence in the quality, integrity, currency and completeness of that data, however, there are two main risks to these business functions. One is that they will fall foul of the authorities by providing inaccurate or out-of-date information, and as a consequence incur delays to market.

The other is that everything they do from this point will be founded on false assumptions, so that problems are perpetuated and risks are magnified as documents are built from data, data is extracted from documents, and changes and updates are introduced to some or all of that content.


‘Data transformation, governance and maintenance should be a continuous discipline rather than a one-off effort’


On top of the inevitable delays and additional expense incurred by reactive damage limitation, companies entering into a remedial data preparation exercise are likely to be investing new resources in a finite project with a single target use case. This is directly at odds with the higher and more strategic aim of creating a more data-driven business.

Regulatory information is a great starting point for proactive data transformation, because it serves so many functions and use cases beyond its own. Transforming the quality of regulatory information warrants senior buy-in, since commercial and reputational outcomes stem from it.

From manufacturing to quality to pharmacovigilance, accurate and consistent regulatory information has an integral role to play in maintaining standards, identifying anomalies, managing changes, keeping records, upholding patient safety and demonstrating continuous compliance. There are some critical considerations around regulatory data quality, however.

1. Start improving data quality now

First, any process and any supporting IT system is only as good as the data it has access to and what it can reliably do with that data. Irrespective of how prestigious the change management consulting firm or how modern the chosen IT system, unless there is defined, meaningful, dependable and up-to-date content, any investment in these powerful enablers will be compromised.

Improving the quality and usability of data is something that needs to start now, outside of any specific departmental system project. It will also take a long time and needs to happen continuously. For this reason, it’s important to assess and scope the work, prioritising target data and critical use cases so that transformation can happen in a staged way towards a long-term goal.

2. Set a realistic scope for the data quality project

Second, being realistic and setting expectations will be important in ensuring that the quality, richness and overall value of regulatory data grows and benefits the organisation as a whole. Not all life sciences companies have the resources to overhaul all of their data in one go, and data transformation, governance and maintenance should be a continuous discipline rather than a one-off effort.

At some point the responsibility for data monitoring and upkeep needs to become part of everyone’s day-to-day remit and routine.

3. Choose high-impact data

It will be important to set some parameters and decide which data takes priority. Logically, it makes sense to target improvements geographically – starting with strategically important target markets – and then to focus on the data with the greatest impact: for instance, the sets most likely to be consumed by others outside of the regulatory operation.
  
This data is likely to be:

  • Details of existing product registrations and their status
  • Details of where products are registered
  • The latest documents and information used in these registrations
  • Information about safety-related changes and details.

Work on operational data – such as who made what changes, as well as any data linked to the company’s internal planning and tracking objectives – can come later. Details of interactions with the various external authorities might also fall within this later tranche of data improvement/enrichment.
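To make the prioritisation above more concrete, here is a minimal sketch in Python of a single high-impact registration record and a completeness check over the fields listed earlier. The `RegistrationRecord` structure and its field names are illustrative assumptions for this article, not a prescribed data model.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class RegistrationRecord:
    """Hypothetical high-impact regulatory data for one product in one market."""
    product_name: str
    market: str                                   # where the product is registered
    registration_status: Optional[str] = None     # e.g. 'approved', 'pending'
    latest_submission_documents: List[str] = field(default_factory=list)
    safety_related_changes: List[str] = field(default_factory=list)

def completeness_gaps(record: RegistrationRecord) -> List[str]:
    """Return the names of high-impact fields that are missing or empty."""
    gaps = []
    if not record.registration_status:
        gaps.append("registration_status")
    if not record.latest_submission_documents:
        gaps.append("latest_submission_documents")
    if not record.safety_related_changes:
        gaps.append("safety_related_changes")
    return gaps

# Example: an incomplete record surfaces its gaps for remediation
record = RegistrationRecord(product_name="Product X", market="DE")
print(completeness_gaps(record))
# ['registration_status', 'latest_submission_documents', 'safety_related_changes']
```

Run across an inventory of registrations, a simple check like this gives a first view of which markets and products need enrichment before other functions can rely on the data.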

4. Use standards to support data quality work

From the EU’s evolving implementation of ISO IDMP to the FDA’s PQ/CMC requirements in the US, and equivalents in Asia and beyond, it makes sense to use upcoming regulations as an additional driver of data preparations. This will both enhance the business case and ensure that work doesn’t have to be repeated because expected data fields or formats weren’t taken into consideration during the ‘groundwork’ phase.
  
Most of the emerging standards are geared to driving up quality and enabling more efficient, data-driven processes between marketing authorisation applicants/holders, agencies and patients, so using the predefined scope of new standards should put companies on the right path and save duplication of effort later.

Although IDMP is currently mainly a European endeavour, other markets have plans to adopt similar parameters, and IDMP benefits from supporting cross-functional use cases – not just pure regulatory applications.
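As a rough illustration of how a standard’s predefined scope can steer the groundwork, the sketch below audits a legacy record against a checklist of required attributes. The attribute names are stand-ins chosen for readability; the actual IDMP attribute set is defined by the ISO specifications and the EMA implementation guides.

```python
# Hypothetical checklist of attributes a target standard might require;
# the real IDMP attribute set comes from the ISO specs and EMA guidance.
REQUIRED_ATTRIBUTES = {
    "medicinal_product_name",
    "marketing_authorisation_holder",
    "authorisation_country",
    "pharmaceutical_dose_form",
    "active_substance",
}

def audit_record(record: dict) -> set:
    """Return the required attributes that are absent or blank in a record."""
    return {attr for attr in REQUIRED_ATTRIBUTES if not record.get(attr)}

# Example: flag the fields to enrich before the standard becomes mandatory
legacy_record = {
    "medicinal_product_name": "Product X 10 mg tablets",
    "authorisation_country": "DE",
}
print(sorted(audit_record(legacy_record)))
# ['active_substance', 'marketing_authorisation_holder', 'pharmaceutical_dose_form']
```

Keeping the checklist aligned with emerging regional requirements means the same audit can be extended rather than rebuilt for each new regulation.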

5. Put people first

Appointing cross-functional teams will help ensure that all users of regulatory data within the organisation have a voice and can play a part in determining the relative value of information and its various use cases. This will help pinpoint which data is most critical and needs the most work, and how that data is currently collated, stored, maintained and used across processes and departments.

As with any initiative of strategic importance, transformation needs to be desired, championed and led from the top of the organisation. Better data and clearer process visibility need to become part of the DNA of the business, which demands a shift in mindset.
  
Then a whole framework will need to be agreed, backed up by standard operating procedures and policies for everyone to follow.

In an ideal scenario, certainly if the company is big enough, dedicated data managers should be appointed or hired to oversee the data transformation journey and to supervise execution of the new framework.
  
There are two qualifications to this, however. First, everyone will need to take a degree of responsibility for adhering to new data management guidelines and maintaining ongoing data quality. Second, there is plenty of external help available for already overstretched teams striving to cope with all of this alongside their day jobs.

Tap into technology

Until companies get a handle on data quality, they will continue to be caught out by audits, missed registrations, or project disruption triggered by the discovery of poor-quality data. Effective technology can play a key role in addressing data quality issues, offering flexibility and granularity to accommodate current and future data requirements and driving good data practice.

Taking the first step towards improved data quality has never been more important. Data that is complete, up to date and accurate is vital to power the data-driven processes that will speed up regulatory activities and boost business outcomes.


Agnes Cwienczek is Head of Product Management and Consulting at Amplexor. Go to amplexor.com