Analytical Solutions and Challenges for Medicines Manufacturing Towards Industry 4.0 - Report
Posted on 11/06/2020
You can view the full report here.
Weighing molecules with light; computer-aided biology; virus lasers and mass spectrometry for the masses: these are just some of the latest techniques and approaches designed to help the medicines manufacturing sector take advantage of the fourth industrial revolution.
Delegates at KTN’s Medicines Manufacturing Challenge Community (MMCC) event discussed a range of academic and industry innovations that can help the biopharma sector improve its manufacturing processes. Examples came from work on both small and large molecules, and explored what could be learned from each. The meeting – held in February – was organised jointly with the Centre for Process Innovation (CPI) and took place at its National Biologics Manufacturing Centre in Darlington.
The MMCC initiative is designed to develop the medicines manufacturing community – both by stimulating the uptake of the innovative outputs from Industrial Strategy Challenge Funding, and by supporting consortium building.
The industry has access to more data, and more complex data sets than ever before, thanks to advances in sensor and data analysis technologies. However, its scientists are often unable to acquire knowledge directly from measurements of a pharmaceutical molecule and instead have to make inferential measurements – often product by product. Ultimately that affects the reproducibility of experiments and products, and drives a need for new sensors and analytics.
“analytics are becoming a barrier to advanced biopharma development”
Allan Watkinson, director of biopharmaceutical development at Covance, says analytics are becoming a barrier to advanced biopharma development and manufacturing. Expensive cell and gene therapies are not only complex to engineer and manufacture, but often have a short shelf life, so industry wants to see the vision of real-time release testing become more of a reality. “It’s pointless having something which you need to release immediately and then have to do 21 days plus for sterility testing,” suggests Watkinson. A cross-industry workshop run by the BIA (UK BioIndustry Association) has explored ideas to develop future analytics. It concluded that the key to moving forward with analytics was to “really understand what your CQAs [critical quality attributes] are, then you can say what aspects of existing technologies need to be improved”, and where new analytics technologies need to be developed. There is, Watkinson added, a need for “solid standards”.
New approaches to process monitoring
Complex APIs (active pharmaceutical ingredients) and biologics present stability challenges that have prompted the growth of freeze-drying for final product formulation. Geoff Smith, Professor of Pharmaceutical Process Analytical Technology at De Montfort University, presented two new approaches to understanding the freeze-drying cycle, which would allow industry to capture process information. At present, “you don’t know if it’s the process or the formulation [or both] that ultimately is stabilising the protein,” he explains.
Alison Nordon, in the Department of Pure and Applied Chemistry at Strathclyde University, has applied optical spectroscopic techniques to monitor a range of pharmaceutical processes from API synthesis through to the final dosage form. In one example, near infrared spectroscopy was used to monitor the blending of an API, where the mixing end point is usually based on experience. The researchers deployed the technique at GlaxoSmithKline, enabling the company to achieve continuous verification and an 80% reduction in blend time – saving both energy and costs.
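A common way to turn spectra like these into an objective end point is a moving-block standard deviation: as the blend homogenises, successive readings converge, and the spread within a sliding window falls below a threshold. The sketch below illustrates the idea with a simulated single-wavelength trace; the function name, window size, and threshold are illustrative assumptions, not details of the GlaxoSmithKline deployment.

```python
# Sketch of a blend end-point check via moving-block standard deviation:
# declare the end point when the spread of the most recent readings
# drops below a chosen threshold. Names and values are illustrative.

from statistics import pstdev

def blend_end_point(readings, block=5, threshold=0.01):
    """Return the index at which the moving-block standard deviation of the
    readings first drops below the threshold, or None if it never does."""
    for i in range(block, len(readings) + 1):
        if pstdev(readings[i - block:i]) < threshold:
            return i - 1  # index of the last reading in the converged block
    return None

# Simulated single-wavelength absorbance trace that settles as mixing completes
trace = [0.90, 0.60, 0.75, 0.55, 0.52, 0.505, 0.503, 0.502, 0.501, 0.501]
print(blend_end_point(trace))
```

In practice the same calculation would run over full spectra rather than a single wavelength, but the logic – watch the variability of a sliding block and stop when it stabilises – is the same.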
Chris Spencer, a scientist in AstraZeneca’s cell culture and fermentation science team, described the work he and colleagues carried out with University College London to assess commercially available Process Analytical Technology (PAT) that could potentially allow monitoring of key process parameters in bioreactor systems. No available system fully met their requirements – amply demonstrating the sort of gaps the industry hopes new funding can close.
Managing data is itself a challenge – prompting new systems for data import and structuring, as well as a plethora of machine learning tools to build process models. Matthew McEwan, who leads the Applications Engineering Team for formulated products at Perceptive Engineering, said machine learning required the same attention to the fundamentals as any other type of modelling: “Are we collecting the right data? Is it sufficiently rich? Have I got the right model? And am I sufficiently validating that? I think a lot of the pitfalls we can potentially avoid by being rigorous with those three steps.” But he urged delegates not to assume machine learning is always the answer. Where simpler linear models work, they’re “so much more supportable and understandable in the long term”. McEwan suggests the pharma sector can also learn from other industries: “we have not really mined the best of industry 3.0 – before we even start on 4.0.”
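McEwan’s preference for simple, validated models can be made concrete: a one-variable least-squares line fitted on training batches and then checked against held-out batches is transparent and easy to support. The sketch below assumes illustrative data (e.g. a feed rate against a measured quality attribute) and an arbitrary acceptance threshold; none of it comes from the talk itself.

```python
# Sketch of the "simple model, rigorously validated" approach: fit a
# one-variable least-squares line on training data, then measure its
# error on held-out data before trusting it. Data are illustrative.

def fit_line(xs, ys):
    """Closed-form ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def rmse(model, xs, ys):
    """Root-mean-square error of the fitted line on a dataset."""
    a, b = model
    return (sum((a * x + b - y) ** 2 for x, y in zip(xs, ys)) / len(xs)) ** 0.5

# e.g. x = feed rate, y = quality attribute, split into train/validation
train_x, train_y = [1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.1]
valid_x, valid_y = [6, 7], [12.0, 14.1]

model = fit_line(train_x, train_y)
error = rmse(model, valid_x, valid_y)
print(f"validation RMSE: {error:.3f}")  # accept only if below a set tolerance
```

If the held-out error is acceptable, the linear model wins on supportability; only when it clearly is not does reaching for machine learning become justified.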
“not a data science problem, but a data engineering problem”
Opvia and Synthace have both developed platforms for structuring and analysing data. “Five to ten years ago it was super-easy to do everything in spreadsheets… but now you’ve got gigabytes of data, I’d argue Excel is no longer the right tool for the job,” says Opvia co-founder Will Moss. Having more data from increasingly disparate sources creates its own problems – not least of reproducibility. It also requires new skills. Getting “all your data together with the context, in a clean final format – which data scientists can do things with – that’s not a data science problem, that’s a data engineering problem,” he asserts.
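The data-engineering step Moss describes amounts to normalising readings from disparate sources into one tidy, contextualised schema before any analysis starts. The sketch below is a minimal illustration of that idea; the source layouts, field names, and batch identifier are all assumptions for the example, not anything from Opvia’s platform.

```python
# Sketch of a data-engineering step: normalise readings from two
# differently shaped sources into one clean record set with context
# (source and batch) attached. Layouts and names are illustrative.

from datetime import datetime

# Two instruments exporting the same quantity in different shapes
bioreactor_log = [("2020-02-11 09:00", 7.02), ("2020-02-11 10:00", 6.98)]
offline_samples = [{"sampled_at": "2020-02-11T09:30:00", "ph": 7.00}]

def to_record(timestamp, ph, source, batch):
    """Normalise one reading into a single tidy schema with its context."""
    return {"timestamp": timestamp, "ph": ph, "source": source, "batch": batch}

records = (
    [to_record(datetime.strptime(t, "%Y-%m-%d %H:%M"), v, "bioreactor", "B42")
     for t, v in bioreactor_log]
    + [to_record(datetime.fromisoformat(s["sampled_at"]), s["ph"], "offline", "B42")
       for s in offline_samples]
)
records.sort(key=lambda r: r["timestamp"])
print([(r["source"], r["ph"]) for r in records])
```

Once every reading carries its timestamp, source, and batch in one schema, the “clean final format which data scientists can do things with” is just this record set.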
Synthace makes software to programme both lab and data automation. Markus Gershater, Chief Scientific Officer and co-founder, describes their concept as “computer-aided biology”. AI, he says, will help narrow the design space, “but that still means you then need to run more experiments.” Alongside others, Marc-Olivier Baradez at the Cell and Gene Therapy Catapult used Synthace’s Antha platform to capture and structure all its datasets, to allow analysis and model building as it tested PAT strategy for cell and gene therapy manufacture. The sheer complexity of these molecules means that producing them with a high level of consistency is a huge challenge. “You need to understand what to measure; what it means in terms of the physiological and biological activity of the cells; and how to do these measurements in real time – either for process control, or in the context of DoE [Design of Experiments] for process optimisation,” says Baradez.
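The DoE approach Baradez mentions can be illustrated with the simplest design: a full factorial, where every combination of factor levels is enumerated and run once so a model can be fitted across the whole space. The factor names and levels below are illustrative assumptions, not parameters from the Catapult’s project.

```python
# Sketch of a full-factorial Design of Experiments: enumerate every
# combination of process-factor levels as one run each.
# Factor names and levels are illustrative assumptions.

from itertools import product

factors = {
    "seeding_density": [1e5, 5e5],   # cells/mL
    "feed_day": [2, 3],
    "temperature_C": [36.5, 37.0],
}

# Each run is one combination of factor levels
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

print(len(design))  # 2 x 2 x 2 = 8 runs
print(design[0])
```

Real cell-therapy designs have far more factors, which is why fractional designs and AI-assisted narrowing of the design space matter – a full factorial grows exponentially with each factor added.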
In another project, Lewis Wharram, a development scientist at CPI, described the work he and colleagues had done on the BioStreamline project to develop decision-making tools for monoclonal antibodies. Two hundred candidates were characterised and ranked to determine which had the best developability prospects.
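A ranking step of this kind typically combines several developability measurements per candidate into one weighted score and sorts on it. The sketch below shows that pattern; the attribute names, weights, and scores are illustrative assumptions, not BioStreamline data.

```python
# Sketch of candidate ranking: combine several developability scores per
# antibody candidate into a weighted sum and sort, best first.
# Attributes, weights, and values are illustrative.

def rank_candidates(candidates, weights):
    """Sort candidates by the weighted sum of their attribute scores."""
    def score(c):
        return sum(weights[k] * c[k] for k in weights)
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"name": "mAb-001", "expression": 0.8, "stability": 0.6, "solubility": 0.9},
    {"name": "mAb-002", "expression": 0.5, "stability": 0.9, "solubility": 0.7},
    {"name": "mAb-003", "expression": 0.9, "stability": 0.8, "solubility": 0.8},
]
weights = {"expression": 0.4, "stability": 0.4, "solubility": 0.2}

ranked = rank_candidates(candidates, weights)
print([c["name"] for c in ranked])
```

The choice of weights encodes which critical quality attributes matter most – which is exactly why, as Watkinson noted earlier, understanding your CQAs has to come before building the analytics.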
The BIA wants to see promising data and process analytics approaches developed so that digitally enabled manufacturing moves a step closer. Discussion is underway to establish a funding platform so ideas can be developed.