
By Volker Eck, PhD, PDA Europe

Does this sound familiar to you? “You have not established and documented the accuracy, sensitivity, specificity, and reproducibility of test methods as required by 21 CFR § 211.165(e).”

“The test methods performed for XXX USP have not been verified to ensure suitability under actual conditions of use… Specifically, you have failed to conduct adequate verification of USP compendial test methods as applied to the production of your firm’s XXX.”

“Furthermore, our investigators found that numerous products were tested using analytical methods, provided by outside sources, which had not been validated / verified according to SOP XXX and SOP XXX to determine these methods’ suitability for their intended use.”

“Method validation documentation did not include appropriate data to verify that the analytical method produced accurate and reliable data.”

Insufficiencies in analytical method / procedure validation are still a frequent inspection observation. The picture becomes even more complex when analytical method / procedure validation is rethought under the auspices of quality by design (QbD). Terminology is the least of the problems, yet there is undoubtedly some confusion about what is meant by an analytical procedure, as described in the ICH Q2 documentation, as opposed to an analytical method.

In an article by Mark Schweitzer (Abbott Laboratories), Matthias Pohl (Novartis), Melissa Hanna-Brown (Pfizer), Phil Nethercote and Phil Borman (both GlaxoSmithKline), Gordon Hansen (Boehringer-Ingelheim), Kevin Smith (Cephalon) and Jaqueline Larew (Eli Lilly), the authors discuss the implications and opportunities of applying QbD principles to analytical measurements. The article encourages readers to improve the robustness of analytical methods and to apply continuous improvement concepts to them. The claim is “that the steps, tools, and approaches developed for application of QbD to manufacturing processes (and described in ICH Q8, Q9, and Q10) have analogous application to the development and use of analytical methods”.

In analogy to the quality target product profile (QTPP), which leads to defining critical quality attributes (CQAs), an analytical target profile (ATP) is proposed. The ideas and concepts conveyed in the article are the outcome of a joint effort of the Pharmaceutical Research and Manufacturers of America (PhRMA) Analytical Technical Group (ATG) and the European Federation of Pharmaceutical Industries and Associations' (EFPIA) analytical design space (ADS) topic team.

An ATP would be defined in the same way that the process control strategy is defined and in the same manner in which CQAs requiring measurement are identified. The development of appropriate analytical methods is, however, fundamental to establishing product and process control (whether in a traditional or a QbD development approach) and the overall control strategy.

Once the ATP has been defined, the principles of QbD can be used during method development and evaluation to ensure that an appropriate analytical measurement technology is selected and that the analytical method is designed to meet its intended performance requirements.
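One common QbD tool at this stage is a designed experiment that screens method parameters for robustness. The sketch below is purely illustrative and is not taken from the article: the factor names, levels, and response values for a hypothetical HPLC method are assumptions, used only to show how main effects from a small full-factorial design can flag parameters to which the method is sensitive.

```python
# Illustrative sketch only: a 2^3 full-factorial robustness screen for a
# hypothetical HPLC method. Factor names, levels, and responses are assumed.
import itertools
import numpy as np

factors = ["column_temp_C", "mobile_phase_pH", "flow_mL_min"]

# Coded design matrix: every combination of low (-1) and high (+1) levels.
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Hypothetical measured response (e.g., critical-pair resolution) per run.
response = np.array([2.1, 2.0, 2.3, 2.2, 1.6, 1.7, 2.2, 2.1])

# Main effect of each factor: mean response at the high level minus the mean
# at the low level. Small effects suggest robustness to that factor.
for name, column in zip(factors, design.T):
    effect = response[column == 1].mean() - response[column == -1].mean()
    print(f"{name}: main effect = {effect:+.2f}")
```

In a real development study the factor ranges would be chosen from the ATP and prior knowledge, and significant effects would feed back into the method's design space and control strategy.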

The authors conclude that “in the desired future state for a QbD-approach based submission, the focus of the analytical-measurement portion of the submission will be to demonstrate a thorough understanding of the requirements for measuring the drug substance / product and process CQAs used to define the design space of the process and describe how this understanding is translated into an ATP. The commitment the company makes will be to ensure that any method used to measure CQAs and quality assurance meets the registered ATP, but there shall be no commitment to follow the detailed analytical methodology provided as an example”.

This would be a radical change from today's situation, where, once a method has been submitted, changes are sometimes difficult to implement even when they would clearly benefit the control and assurance of product quality. It would also allow the registration of multiple alternative methods, and “as multiple methods (alternative methods) may be in use and may be available for regulatory authorities, tools to compare the performance of these alternative methods with others and ensure equivalency will need to be established”.
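The article does not prescribe how such equivalency would be demonstrated. One possible statistical approach is a two one-sided tests (TOST) comparison of the mean results reported by two methods on the same samples. The following sketch is an assumption for illustration only: the data, the equivalence margin, and the significance level are hypothetical, and in practice the acceptance criteria would be derived from the registered ATP.

```python
# Illustrative sketch only: TOST equivalence comparison of the mean results
# from two alternative analytical methods. Data, margin, and alpha are assumed.
import numpy as np
from scipy import stats

method_a = np.array([99.8, 100.1, 99.6, 100.3, 99.9, 100.0])   # % label claim
method_b = np.array([100.2, 100.4, 99.9, 100.6, 100.1, 100.3])
margin = 2.0   # pre-defined equivalence margin (assumed)
alpha = 0.05

diff = method_a.mean() - method_b.mean()
n_a, n_b = len(method_a), len(method_b)

# Pooled standard error and degrees of freedom (equal-variance assumption).
sp2 = ((n_a - 1) * method_a.var(ddof=1)
       + (n_b - 1) * method_b.var(ddof=1)) / (n_a + n_b - 2)
se = np.sqrt(sp2 * (1 / n_a + 1 / n_b))
df = n_a + n_b - 2

# Two one-sided tests against the lower and upper equivalence bounds.
p_lower = stats.t.sf((diff + margin) / se, df)   # H0: diff <= -margin
p_upper = stats.t.cdf((diff - margin) / se, df)  # H0: diff >= +margin

equivalent = max(p_lower, p_upper) < alpha
print(f"difference = {diff:+.2f}, TOST p = {max(p_lower, p_upper):.4f}, "
      f"equivalent: {equivalent}")
```

Equivalence is concluded only if both one-sided tests reject their null hypotheses, i.e., the observed difference lies demonstrably within the pre-defined margin.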

The authors acknowledge the impact this would have, such as the investment needed to provide multiple methods. Their view is essentially that some aspects of a validation package could be drawn from prior knowledge and need not be repeated for validation purposes. Others might no longer be as relevant in light of a risk assessment based on such an ATP and might no longer be required, or could be associated with broader acceptance criteria. The conclusion is that a smaller volume of validation studies would be needed, so introducing alternative methods might also go hand in hand with less validation work per method.

This and other new and emerging regulatory trends were discussed and explored in more detail at the PDA Workshop on Analytical Method / Procedure Validation, held in Vienna, Austria, on 11-12 November 2010. To find out about similar upcoming events, please contact PDA.