
The patent cliff, advanced modalities and an R&D IT reset

Big reset


With a pending patent cliff putting tens of billions of dollars in sales at risk of losing exclusivity between now and 2030, biotech companies are feeling the pressure to find the next blockbuster.

New advanced modalities are a major focus. Think precision medicines – differentiated products that deliver better patient outcomes and set companies up to scale the patent cliff.

Antibody-drug conjugates (ADCs) are a prime example. Offering cancer treatment that is less toxic than conventional chemotherapy, and increasingly winning regulatory approval, ADCs are a promising path for pharma growth.

The shift to new modalities is a strategic inflection point for biopharma. Having the right foundation in place for R&D teams to be operationally successful is key.

The tools designed fifteen years ago were built for the therapeutics that dominated at the time – monoclonal antibodies and small molecules.

Meanwhile the current paradigm includes therapeutics like ADCs and modified oligonucleotides that blur the boundaries between biology and chemistry.

Bringing advanced modalities from the lab to patients requires more data – complex, multimodal data – and a greater need for collaboration across specialisations.

These new modalities upend the way work is done in the lab. You can’t simply apply legacy tools built for small molecule R&D.

Here are the areas where we’re seeing the most progress today, as companies evolve their tech strategy to capture the opportunity in advanced modalities.

Model performers

With advanced modalities, scientists require more flexible data models that support the molecular complexity as well as the rate of change in the experimental techniques being used. Flexible data models also need to meet the business requirement for compliance-ready data.
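To make that concrete, here is a minimal sketch of what a flexible, composable data model for an ADC might look like. The entity and field names are illustrative assumptions, not any particular platform’s schema, and the sequences and SMILES strings are truncated placeholders.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Antibody:
    """Biologic component, registered by sequence."""
    name: str
    heavy_chain_seq: str   # truncated placeholder sequences in the example below
    light_chain_seq: str


@dataclass
class SmallMolecule:
    """Chemical component, registered by structure (SMILES string)."""
    name: str
    smiles: str


@dataclass
class ADC:
    """An antibody-drug conjugate modelled as a composition of typed parts,
    so the schema can evolve (new linker chemistries, new payloads) without
    remodelling every downstream assay that references it."""
    name: str
    antibody: Antibody
    linker: SmallMolecule
    payload: SmallMolecule
    drug_antibody_ratio: Optional[float] = None  # measured later, so optional


adc = ADC(
    name="example-adc-001",
    antibody=Antibody("anti-HER2-mAb", "EVQLV...", "DIQMT..."),
    linker=SmallMolecule("mc-vc-PAB", "CC(C)..."),
    payload=SmallMolecule("MMAE", "CC[C@H]..."),
)
```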

Closer collaboration between research and development teams has become an imperative. Teams need greater visibility and access to key pieces of the underlying technology in order to bring products to market, and that visibility is now mandatory early and throughout the R&D life cycle.

IT sees a solution in taking an end-to-end approach to data and teams, with one platform spanning research, development and manufacturing.

The key is moving away from siloed apps towards a single interface: a central platform where scientists capture and manage data across research, process development and into manufacturing – essentially following the molecule.

In recent years, there has been significant improvement in end-to-end platforms with adaptable data models for different teams.

These platforms provide user-friendly dashboards, high-throughput data APIs for ETL (extract, transform, load) processes, configurable schemas and robust audit trails.
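As a rough sketch of how those pieces fit together, an ETL job might pull assay results from a platform’s data API and load them into a queryable table. The endpoint, field names and table below are illustrative assumptions, not a specific vendor’s API.

```python
import sqlite3

import requests  # third-party: pip install requests

# Hypothetical endpoint and token, stand-ins for a real platform API.
API_URL = "https://lims.example.com/api/v2/assay-results"
API_TOKEN = "..."


def extract(page_size: int = 100):
    """Extract: page through assay results from the platform's REST API."""
    next_token = None
    while True:
        resp = requests.get(
            API_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            params={"pageSize": page_size, "nextToken": next_token},
            timeout=30,
        )
        resp.raise_for_status()
        body = resp.json()
        yield from body["results"]
        next_token = body.get("nextToken")
        if not next_token:
            break


def transform(record: dict) -> tuple:
    """Transform: flatten one nested API record into a warehouse-friendly row."""
    return (
        record["id"],
        record["sample"]["name"],
        float(record["fields"]["ic50_nm"]),
        record["createdAt"],
    )


def load(rows, db_path: str = "warehouse.db") -> None:
    """Load: upsert rows into a local table (a stand-in for a real warehouse)."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS assay_results "
            "(id TEXT PRIMARY KEY, sample TEXT, ic50_nm REAL, created_at TEXT)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO assay_results VALUES (?, ?, ?, ?)", rows
        )


if __name__ == "__main__":
    load(transform(r) for r in extract())
```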

Connectivity more

The challenge of automating and standardising data capture from instruments has long plagued the industry.

Scientists and IT teams report that their organisations use more than 100 lab instruments, with nearly half (47%) indicating that fewer than three in five of those instruments are connected to software that supports data capture.

With new modalities, labs are contending with rapidly growing, complex data sets. The challenge of instrument connectivity is front and centre.


‘As techniques have become more complex, the amount of data generated has massively increased, and data has become biotech’s lifeblood’


Fortunately, progress is underway, thanks to a movement away from proprietary data formats and vendor lock-in, and towards open industry standards, open source tools and data integrations.

Earlier this year, the Allotrope Foundation achieved an important milestone, launching publicly available data standards for lab instruments using the Allotrope Simple Model (ASM).

Recently, Benchling built on this momentum with the launch of Connect, which automates instrument data capture and management using a unique open source approach, mapping all instrument output to the ASM and making the converter code open source and freely available on GitHub.
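To illustrate the general idea – this is not the actual ASM schema or Benchling’s converter code, and the field names are simplified assumptions – a converter of this kind reads a vendor-specific instrument export and re-emits it as a standardised, instrument-agnostic document:

```python
import csv
import json
from pathlib import Path


def convert_plate_reader_csv(csv_path: str) -> dict:
    """Map a vendor-specific plate-reader CSV export into a simplified,
    ASM-inspired JSON structure (illustrative field names only)."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    return {
        "measurement aggregate document": {
            "device system document": {
                "model number": rows[0].get("Instrument", "unknown"),
            },
            "measurement document": [
                {
                    "sample document": {"well location": r["Well"]},
                    "absorbance": {"value": float(r["OD600"]), "unit": "OD"},
                }
                for r in rows
            ],
        }
    }


if __name__ == "__main__":
    doc = convert_plate_reader_csv("plate_export.csv")
    Path("plate_export.asm-like.json").write_text(json.dumps(doc, indent=2))
```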

This means R&D organisations can now automate time-consuming, manual lab instrument data collection, and consolidate the management of experimental data and metadata on one centralised platform. Security is another key benefit of instrument connectivity.

With better automated and integrated systems, companies will move away from data scattered across individual USB drives and personal laptops.

Instead, there will be a move to structured data, with systems that deliver GxP compliance and an audit trail.
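At its simplest – and this is only a sketch of the mechanism, not a GxP-validated implementation – an audit trail is an append-only log that records who changed what, and when, so any value in the system can be traced:

```python
import datetime
import sqlite3


def record_change(conn, user, entity_id, field, old, new):
    """Append one immutable audit entry; rows are never updated or deleted."""
    conn.execute(
        "INSERT INTO audit_trail (ts, user, entity_id, field, old_value, new_value) "
        "VALUES (?, ?, ?, ?, ?, ?)",
        (
            datetime.datetime.now(datetime.timezone.utc).isoformat(),
            user, entity_id, field, old, new,
        ),
    )


with sqlite3.connect("lab.db") as conn:
    conn.execute(
        "CREATE TABLE IF NOT EXISTS audit_trail "
        "(ts TEXT, user TEXT, entity_id TEXT, field TEXT, old_value TEXT, new_value TEXT)"
    )
    record_change(conn, "analyst1", "sample-42", "concentration_mg_ml", "1.0", "1.2")
```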

Cloud city

Security, compliance and privacy have always been priorities for biotech, but in the last few years, treating data as a strategic asset has become more crucial than ever.

As techniques and processes have become more complex, the amount of data generated has massively increased, and data has become biotech’s lifeblood.

Biotech knows that managing security risks appropriately requires engineering, automation, threat intelligence and other tooling. Fortunately, the industry is getting more secure outcomes by taking advantage of the economies of scale with cloud platforms and their offerings on cybersecurity, resilience and disaster response.

Tech cloud providers have a duty and incentive to be secure – they invest far more in security than most companies can afford to on their own, and also have an abundance of expertise.

More often than not, maintaining an on-prem strategy exposes a biotech to more risk, because 100% of the security responsibility and resourcing sits with you, the company.

This is a clear win for R&D IT, as it moves the industry away from legacy, on-prem systems that don’t offer the same level of connectivity, productivity and security.

Spectre of AI

R&D scientists are eager to incorporate key technologies such as AI and ML into their work, to make predictions and improve overall efficiency.

As much as AI will drive transformational change in how we discover new drugs, one lesser-discussed but impactful benefit of AI for biotech is that it requires the industry to prioritise strong data foundations that are organised and well structured.

For IT, AI is a forcing function to operationalise the flow of data coming from experimental pipelines, and to integrate previously siloed data sets end to end.

Biotech companies need to build their data strategy and systems before they can benefit from AI and ML. Beyond collecting data, doing so in a standardised way – and anticipating how the data will be consumed – is key.
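In practice, anticipating how data will be consumed can be as simple as validating every captured result against a declared schema before it reaches an analysis or training pipeline. The schema and field names below are illustrative assumptions, not a prescribed standard.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AssayResult:
    """Declared schema for one captured data point, so downstream ML
    pipelines can rely on consistent names, types and units."""
    sample_id: str
    target: str
    ic50_nm: float          # always nanomolar, never mixed units
    instrument_id: str      # provenance, for later filtering and debugging


def validate(raw: dict) -> AssayResult:
    """Reject records that would silently corrupt an analysis or training set."""
    ic50 = float(raw["ic50_nm"])
    if ic50 <= 0:
        raise ValueError(f"non-physical IC50 for {raw['sample_id']}: {ic50}")
    return AssayResult(raw["sample_id"], raw["target"], ic50, raw["instrument_id"])


clean = [
    validate(r)
    for r in [
        {"sample_id": "S-001", "target": "HER2",
         "ic50_nm": "12.5", "instrument_id": "plate-reader-01"},
    ]
]
```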

The industry is racing to invent and deliver better medicines, faster, and it needs to develop a new tech strategy in parallel.

To navigate this new normal, IT teams will play an increasingly pivotal role in ushering in new data management strategies. With cohesive data strategies and modern tech, biotech companies can set themselves up for success for the next two decades of discovery.

Companies that can align their teams and rapidly incorporate new tech stand to be the most productive in biopharma, driving success for decades to come.


Vega Shah is Senior Product Marketing Manager at Benchling.
Go to benchling.com