Arlenda SA

Liège, Belgium


Dispas A., University of Liège | Lebrun P., University of Liège | Lebrun P., Arlenda s.a. | Ziemons E., University of Liège | And 4 more authors.
Journal of Chromatography A | Year: 2014

Recently, the number of papers on SFC has increased drastically, but scientists have rarely focused on the quantitative performance of this technique. To demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method, from development through validation to application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of robust methods. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method gives accurate results over a dosing range wider than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). © 2014 Elsevier B.V.
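
The geometric transfer mentioned above follows conventional column-scaling rules. A minimal sketch, assuming the classic relations for flow rate, injection volume and gradient time (the column dimensions below are illustrative, not those of the study):

```python
# Geometric transfer of an LC method between columns: a sketch of the
# conventional scaling rules (keep reduced linear velocity, loading per
# column volume, and column volumes delivered during the gradient).

def geometric_transfer(flow, inj_vol, grad_time,
                       d1, L1, dp1,   # original column: diameter, length, particle size
                       d2, L2, dp2):  # target column
    flow2 = flow * (d2 / d1) ** 2 * (dp1 / dp2)       # same reduced linear velocity
    inj2 = inj_vol * (d2 ** 2 * L2) / (d1 ** 2 * L1)  # same loading per column volume
    # same number of column volumes delivered during the gradient
    grad2 = grad_time * (L2 / L1) * (d2 / d1) ** 2 * (flow / flow2)
    return flow2, inj2, grad2

# Example: HPLC 4.6 x 150 mm, 5 um  ->  UHPLC 2.1 x 50 mm, 1.7 um
f2, v2, t2 = geometric_transfer(1.0, 20.0, 30.0,
                                4.6, 150.0, 5.0,
                                2.1, 50.0, 1.7)
```

With these illustrative dimensions the 30 min gradient shrinks to about 3.4 min, which is the kind of analysis-time reduction the transfer aims at.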

Rozet E., University of Liège | Ziemons E., University of Liège | Marini R.D., University of Liège | Boulanger B., Arlenda SA | Hubert P., University of Liège
Analytical Chemistry | Year: 2012

The concept of quality by design (QbD) has recently been adopted for the development of pharmaceutical processes to ensure a predefined product quality. Focus on applying the QbD concept to analytical methods has increased, as they are fully integrated within pharmaceutical processes and especially in the process control strategy. In addition, there is a need to switch from the traditional checklist implementation of method validation requirements to a validation approach providing a high level of assurance of method reliability, in order to adequately measure the critical quality attributes (CQAs) of the drug product. The intended purpose of analytical methods is directly related to the final decision that will be made with the results they generate. For quantitative impurity assays, the final aim is to correctly declare a substance or a product compliant with the corresponding product specifications. For content assays, the aim is similar: making the correct decision about product compliance with respect to the specification limits. It is for these reasons that the fitness of these methods should be defined, as they are key elements of the analytical target profile (ATP). Therefore, validation criteria, corresponding acceptance limits, and method validation decision approaches should be set in accordance with the final use of these analytical procedures. This work proposes a general methodology to achieve this, in order to align method validation with the QbD framework and philosophy. β-Expectation tolerance intervals are implemented to decide about the validity of analytical methods. The proposed methodology is also applied to the validation of analytical procedures dedicated to the quantification of impurities or active pharmaceutical ingredients (API) in drug substances or drug products, and its applicability is illustrated with two case studies. © 2011 American Chemical Society.
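
The β-expectation tolerance interval decision rule can be sketched for the simplest case, a single homoscedastic series of recoveries, where the interval reduces to a Student prediction interval. The recovery values, acceptance limits and t quantile below are illustrative:

```python
# beta-expectation tolerance interval for one series of results: for a
# single series it reduces to a prediction interval,
#   mean +/- t_{(1+beta)/2, n-1} * s * sqrt(1 + 1/n).
# The t quantile is looked up externally (tables or scipy.stats.t.ppf).
from math import sqrt
from statistics import mean, stdev

def beta_expectation_ti(results, t_quantile):
    n = len(results)
    m, s = mean(results), stdev(results)
    half = t_quantile * s * sqrt(1 + 1 / n)
    return m - half, m + half

def method_valid(recoveries_pct, t_quantile, acceptance=(95.0, 105.0)):
    """Valid if the tolerance interval on recovery (%) lies entirely
    inside the acceptance limits."""
    lo, hi = beta_expectation_ti(recoveries_pct, t_quantile)
    return acceptance[0] <= lo and hi <= acceptance[1]

# Six recoveries (%) at one concentration level; t_{0.975, 5} ~= 2.571
recs = [99.2, 100.4, 99.8, 100.9, 99.5, 100.2]
ok = method_valid(recs, 2.571)
```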

Rozet E., University of Liège | Marini R.D., University of Liège | Ziemons E., University of Liège | Boulanger B., Arlenda s.a. | Hubert P., University of Liège
Journal of Pharmaceutical and Biomedical Analysis | Year: 2011

Bioanalytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results in routine application, so that the critical decisions made with them can be trusted. Although several guidelines exist to help perform bioanalytical method validations, there is still a need to clarify the meaning and interpretation of bioanalytical method validation criteria and methodology. Different interpretations can be made of the validation guidelines as well as of the definitions of the validation criteria, leading to diverse experimental designs implemented to try to fulfil these criteria. Finally, different decision methodologies can also be derived from these guidelines. Therefore, the risk that a validated bioanalytical method may be unfit for its future purpose depends on the analyst's personal interpretation of these guidelines. The objective of this review is thus to discuss and highlight several essential aspects of method validation, not restricted to chromatographic methods but also covering ligand-binding assays owing to their increasing role in the biopharmaceutical industry. The points reviewed are the common validation criteria: selectivity, standard curve, trueness, precision, accuracy, limits of quantification and range, dilutional integrity and analyte stability. Definitions, methodology, experimental design and decision criteria are reviewed. Two other points closely connected to method validation are also examined: incurred sample reproducibility testing and measurement uncertainty, as both are highly linked to the reliability of bioanalytical results. Their additional implementation is expected to strongly reduce the risk of having validated a bioanalytical method unfit for its purpose. © 2010 Elsevier B.V.
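
The incurred sample reproducibility testing mentioned above is commonly decided with the rule that at least two-thirds of repeats differ from the original result by no more than 20% of their mean for chromatographic assays (30% for ligand-binding assays). A sketch, with illustrative concentrations:

```python
# Incurred sample reanalysis (ISR) check: percent difference between the
# repeat and original result, relative to their mean, with the commonly
# applied 2/3-within-20% acceptance rule for chromatographic assays.

def isr_pass(originals, repeats, limit_pct=20.0, min_fraction=2 / 3):
    within = 0
    for o, r in zip(originals, repeats):
        pct_diff = abs(r - o) / ((r + o) / 2) * 100.0
        within += pct_diff <= limit_pct
    return within / len(originals) >= min_fraction

# Illustrative concentrations (original vs reanalysed)
orig = [10.0, 25.0, 50.0, 80.0, 120.0, 200.0]
rep = [11.5, 24.0, 49.0, 95.0, 118.0, 150.0]
result = isr_pass(orig, rep)  # 5 of 6 within 20% -> passes
```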

Rozet E., University of Liège | Lebrun P., University of Liège | Hubert P., University of Liège | Debrus B., University of Geneva | Boulanger B., Arlenda s.a.
TrAC - Trends in Analytical Chemistry | Year: 2013

Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a Quality by Design (QbD) approach, there have been many discussions on the opportunity for analytical method development to follow a similar approach. A key component of the QbD paradigm is the definition of the Design Space (DS) of an analytical method, the region where assurance of quality is provided. Several DSs for analytical methods have been published, stressing the importance of this concept. This article aims to explain what an analytical method DS is, why it is useful for the robust development and optimization of analytical methods, and how to build such a DS. We examine the usual mean response surface approach, overlapping mean response surfaces and the desirability function, and discuss whether each of them correctly defines a DS. We also review and discuss recent publications assessing the DS of analytical methods. © 2012 Elsevier Ltd.
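
A desirability function of the kind discussed above can be sketched with the Derringer-type approach: map each response to [0, 1] and rank factor settings by the geometric mean of the individual desirabilities. The responses and target ranges below are illustrative:

```python
# Derringer-type desirability: each response is rescaled to [0, 1] and
# the overall score is the geometric mean of the individual scores, so a
# single unacceptable response (d = 0) zeroes the whole setting.
from math import prod

def desirability_larger_is_better(y, lo, hi):
    """0 below `lo`, 1 above `hi`, linear in between."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return (y - lo) / (hi - lo)

def global_desirability(d_values):
    return prod(d_values) ** (1 / len(d_values))

# e.g. resolution of the critical peak pair and a normalised time score
d_res = desirability_larger_is_better(1.8, 1.5, 2.5)   # resolution 1.8
d_time = desirability_larger_is_better(0.7, 0.0, 1.0)  # time score 0.7
D = global_desirability([d_res, d_time])
```

Note that a mean response surface of D alone still says nothing about the probability of meeting the targets, which is why it is only one ingredient of a DS.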

Rozet E., University of Liège | Marini R.D., University of Liège | Ziemons E., University of Liège | Hubert P., University of Liège | And 3 more authors.
TrAC - Trends in Analytical Chemistry | Year: 2011

The ISO 17025 and ISO 15189 guidelines aim to improve the quality-assurance scheme of laboratories. Reliable analytical results are of central importance because of the critical decisions based on them. ISO 17025 and ISO 15189 therefore require that analytical methods be validated and that laboratories be able to routinely provide the measurement uncertainty of their results. To evaluate the fitness for purpose of analytical methods, total error is increasingly applied to assess the reliability of the results they generate. However, the ISO requirement to estimate measurement uncertainty seems opposed to the concept of total error, leading to delays in laboratories implementing ISO 17025 and ISO 15189 and to confusion among analysts. This article therefore aims to clarify the divergences between total error and measurement uncertainty, but also to discuss their main similarities and their implementation. © 2011 Elsevier Ltd.
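
The relation between the two concepts can be illustrated numerically: both combine the bias and the intermediate precision of a method, but package them differently. A deliberately simplified sketch; the figures and the combination rules are illustrative, not a full GUM or total-error computation:

```python
# Total error vs measurement uncertainty, numerically (illustrative):
# both are built from the same two ingredients, bias and intermediate
# precision, combined in different ways.
from math import sqrt

bias = 1.2   # estimated bias (% of target concentration)
s_ip = 2.0   # intermediate precision standard deviation (%)
n_series, n_per_series = 3, 4  # validation design used to estimate bias

# Total-error style summary: roughly the worst-case single result
total_error = abs(bias) + 2 * s_ip

# Uncertainty-style summary: uncertainty of the bias estimate combined
# with the spread of a single future result, then expanded with k = 2
u_bias = s_ip / sqrt(n_series * n_per_series)
u = sqrt(u_bias ** 2 + s_ip ** 2)
expanded_uncertainty = 2 * u
```

With these numbers the two summaries land in the same few-percent range, which is the practical similarity the article emphasizes.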

Mutsvari T., UCB BioPharma SPRL | Mutsvari T., Arlenda S.A. | Tytgat D., UCB BioPharma SPRL | Tytgat D., Sanofi S.A. | Walley R., UCB Pharma
Pharmaceutical Statistics | Year: 2016

Bayesian methods are increasingly used in proof-of-concept studies. An important benefit of these methods is the potential to use informative priors, thereby reducing sample size. This is particularly relevant for treatment arms where there is a substantial amount of historical information, such as placebo and active comparators. One issue with using an informative prior is the possibility of a mismatch between the informative prior and the observed data, referred to as prior-data conflict. We focus on two methods for dealing with this: a testing approach and a mixture prior approach. The testing approach assesses prior-data conflict by comparing the observed data to the prior predictive distribution and resorts to a non-informative prior if prior-data conflict is declared. The mixture prior approach uses a prior with a precise and a diffuse component. We assess these approaches for the normal case via simulation and show that they have some attractive features compared with the standard one-component informative prior. For example, when the discrepancy between the prior and the data is marked enough that one intuitively feels less certain about the results, both the testing and mixture approaches typically yield wider posterior credible intervals than when there is no discrepancy. In contrast, when there is no discrepancy, the results of these approaches are typically similar to those of the standard approach. Whilst for any specific study the operating characteristics of any selected approach should be assessed and agreed at the design stage, we believe these two approaches are each worthy of consideration. © 2015 John Wiley & Sons, Ltd.
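
For the normal case with known sampling variance, the mixture prior approach has a simple conjugate form: the data re-weight the informative and diffuse components through their marginal likelihoods. A sketch with illustrative numbers:

```python
# Robust mixture prior for a normal mean with known sampling variance:
# each component is updated conjugately, and its posterior weight is
# proportional to (prior weight) x (marginal likelihood of the data).
from math import exp, pi, sqrt

def norm_pdf(x, m, v):
    return exp(-(x - m) ** 2 / (2 * v)) / sqrt(2 * pi * v)

def mixture_posterior(ybar, se2, components):
    """components: list of (weight, prior_mean, prior_var).
    Returns the posterior (weight, mean, var) of each component."""
    marg = [w * norm_pdf(ybar, m, v + se2) for w, m, v in components]
    total = sum(marg)
    post = []
    for (w, m, v), wm in zip(components, marg):
        var = 1 / (1 / v + 1 / se2)
        mean = var * (m / v + ybar / se2)
        post.append((wm / total, mean, var))
    return post

# Informative prior at 0 (e.g. historical placebo) mixed with a diffuse one;
# the observed mean conflicts with it, so the diffuse component takes over.
prior = [(0.8, 0.0, 1.0), (0.2, 0.0, 100.0)]
post = mixture_posterior(ybar=5.0, se2=1.0, components=prior)
```

Here the diffuse component's posterior weight exceeds 0.9, so the overall posterior is dominated by the data, which is exactly the self-discounting behaviour described above.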

Pestieau A., University of Liège | Krier F., University of Liège | Lebrun P., Arlenda S.A. | Brouwers A., Galephar Research Center M F | And 2 more authors.
International Journal of Pharmaceutics | Year: 2015

The aim of this study was to develop a formulation containing fenofibrate and Gelucire® 50/13 (Gattefossé, France) in order to improve the oral bioavailability of the drug. The particles from gas-saturated solutions (PGSS) process was investigated as a manufacturing process for producing a solid dispersion. The PGSS process was optimized according to the in vitro drug dissolution profile obtained using a biphasic dissolution test. Using a design of experiments approach, the effects of nine experimental parameters were investigated on a PGSS apparatus provided by Separex® (Champigneulles, France). Within the chosen experimental conditions, the screening results showed that the drug loading level, the autoclave temperature and pressure, the connection temperature and the nozzle diameter had a significant influence on the dissolution profile of fenofibrate. During the optimization step, the three most relevant parameters were optimized using a central composite design, while the other factors remained fixed. In this way, we were able to identify the optimal production conditions delivering the highest level of fenofibrate in the organic phase at the end of the dissolution test. The closeness of the measured and predicted optimal dissolution profiles in the organic phase demonstrated the validity of the statistical analyses. © 2015 Elsevier B.V. All rights reserved.
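
A central composite design of the kind used in the optimization step can be generated from its three building blocks: factorial corners, axial (star) points and centre replicates. A sketch in coded units, using the rotatable axial distance α ≈ 1.682 for three factors (the run counts here are generic, not the study's actual design):

```python
# Central composite design in coded units: 2^k factorial corners,
# 2k axial points at +/- alpha, plus centre replicates.
from itertools import product

def central_composite(n_factors=3, alpha=1.682, n_center=3):
    corners = [list(p) for p in product((-1.0, 1.0), repeat=n_factors)]
    stars = []
    for i in range(n_factors):
        for a in (-alpha, alpha):
            pt = [0.0] * n_factors
            pt[i] = a
            stars.append(pt)
    centers = [[0.0] * n_factors for _ in range(n_center)]
    return corners + stars + centers

design = central_composite()  # 8 + 6 + 3 = 17 runs for three factors
```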

Rozet E., University of Liège | Ziemons E., University of Liège | Marini R.D., University of Liège | Boulanger B., Arlenda SA | Hubert P., University of Liège
Analytica Chimica Acta | Year: 2012

Dissolution tests are key elements to ensure continuing product quality and performance. The ultimate goal of these tests is to assure consistent product quality within a defined set of specification criteria. Validation of an analytical method aimed at assessing the dissolution profile of products, or at verifying pharmacopoeial compliance, should demonstrate that the method is able to correctly declare two dissolution profiles as similar, or drug products as compliant with their specifications. It is essential to ensure that these analytical methods are fit for their purpose, and method validation is aimed at providing this guarantee. However, even the ICH Q2 guideline gives no information on how to decide whether the method under validation is valid for its final purpose. Are all the validation criteria needed to ensure that a quality control (QC) analytical method for a dissolution test is valid? What acceptance limits should be set on these criteria? How should one decide on a method's validity? These are the questions this work aims to answer. Emphasis is placed on complying with the current implementation of the Quality by Design (QbD) principles in the pharmaceutical industry, in order to correctly define the Analytical Target Profile (ATP) of analytical methods involved in dissolution tests. Analytical method validation then becomes the natural demonstration that the developed methods are fit for their intended purpose, rather than the indiscriminate checklist approach still generally performed to complete the filing required to obtain product marketing authorization. © 2012 Elsevier B.V.
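
Declaring two dissolution profiles as similar is usually decided with the f2 similarity factor (f2 ≥ 50 taken as similar); the abstract does not name the metric, so this is an assumption, and the % dissolved values below are illustrative:

```python
# f2 similarity factor between a reference and a test dissolution
# profile measured at the same time points:
#   f2 = 50 * log10(100 / sqrt(1 + mean squared difference))
from math import log10, sqrt

def f2_similarity(reference, test):
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50.0 * log10(100.0 / sqrt(1.0 + msd))

# Illustrative % dissolved at five common time points
ref_profile = [15.0, 40.0, 65.0, 85.0, 95.0]
test_profile = [18.0, 44.0, 68.0, 86.0, 96.0]
f2 = f2_similarity(ref_profile, test_profile)
similar = f2 >= 50.0
```

A valid QC dissolution method must measure each point accurately enough that this similar/not-similar decision is trustworthy, which is the fitness-for-purpose argument of the abstract.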

Lebrun P., University of Liège | Boulanger B., Arlenda S.A. | Debrus B., University of Liège | Lambert P., University of Liège | Hubert P., University of Liège
Journal of Biopharmaceutical Statistics | Year: 2013

The International Conference on Harmonisation (ICH) has released regulatory guidelines for pharmaceutical development. In the ICH Q8 document, the design space of a process is presented as the set of factor settings providing satisfactory results. However, ICH Q8 does not propose any practical methodology to define, derive, and compute the design space. In parallel, over the last decades, the diversity and quality of analytical methods have improved considerably, allowing substantial gains in selectivity and sensitivity. However, a rationale for developing robust separation methods in a systematic way is still lacking. Applying ICH Q8 to analytical methods provides a methodology for predicting a region of the factor space in which results will be reliable. Combining design of experiments and Bayesian standard multivariate regression, a closed form of the predictive distribution of a new response vector was derived and used, under noninformative as well as informative prior distributions of the parameters. From the responses and their predictive distribution, various critical quality attributes can easily be derived. This Bayesian framework was then extended to the multicriteria setting to estimate the predictive probability that several critical quality attributes will be jointly achieved in the future use of an analytical method. An example based on a high-performance liquid chromatography (HPLC) method is given. For this example, a constrained sampling scheme was applied to ensure that the modeled responses have desirable properties. © 2013 Taylor and Francis Group, LLC.
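
The predictive probability at the core of this approach can be sketched with a plain Monte Carlo approximation: at a candidate factor setting, draw future responses and count how often all critical quality attributes jointly meet their specifications. The normal predictive model below is a stand-in for illustration, not the paper's Bayesian multivariate regression:

```python
# Monte Carlo sketch of the design-space criterion: the setting belongs
# to the design space if the predictive probability of jointly meeting
# all CQA specifications exceeds a chosen quality level.
import random

random.seed(0)

def joint_success_probability(predictors, specs, n_draws=5000):
    """predictors: list of (mean, sd) per CQA at one factor setting;
    specs: list of (lo, hi) specification limits, same order."""
    hits = 0
    for _ in range(n_draws):
        hits += all(lo <= random.gauss(m, s) <= hi
                    for (m, s), (lo, hi) in zip(predictors, specs))
    return hits / n_draws

# e.g. two CQAs at one setting: resolution >= 1.5 and run time <= 10 min
p = joint_success_probability(
    predictors=[(2.0, 0.2), (8.0, 0.5)],
    specs=[(1.5, float("inf")), (0.0, 10.0)],
)
in_design_space = p >= 0.9
```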

Sacre P.-Y., University of Liège | Lebrun P., Arlenda S.A. | Chavez P.-F., University of Liège | Bleye C.D., University of Liège | And 6 more authors.
Analytica Chimica Acta | Year: 2014

During galenic formulation development, homogeneity of distribution is a critical parameter to check, since it may influence the activity and safety of the drug. Raman hyperspectral imaging is a technique of choice for assessing the distributional homogeneity of compounds of interest: the combination of spectroscopic and spatial information provides detailed knowledge of chemical composition and component distribution. Currently, most authors assess homogeneity using parameters of the histogram of intensities (e.g. mean, skewness and kurtosis). However, this approach does not take spatial information into account and loses the main advantage of imaging. To overcome this limitation, we propose a new criterion: the Distributional Homogeneity Index (DHI). DHI has been tested on simulated maps and on formulation development samples. The distribution maps of the samples were obtained without a validated calibration model, since different formulations were under investigation. The results showed a linear relationship between content uniformity values and the DHI values of the distribution maps. Therefore, the DHI methodology appears to be a suitable tool for analysing the homogeneity of distribution maps during formulation development, even without calibration. © 2014 Elsevier B.V.
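
The histogram-only approach the abstract contrasts with DHI can be sketched directly: summarise a map by the moments of its intensity histogram. Two maps with the same pixel values but very different spatial layouts then receive identical scores, which is precisely the limitation DHI addresses (the maps below are toy examples; the DHI formula itself is not reproduced here):

```python
# Histogram moments of a distribution map: mean, standard deviation,
# skewness and excess kurtosis of the pixel intensities. Spatial layout
# is ignored, so segregated and dispersed maps can score identically.
from statistics import mean, pstdev

def histogram_moments(intensities):
    m, s = mean(intensities), pstdev(intensities)
    n = len(intensities)
    skew = sum(((x - m) / s) ** 3 for x in intensities) / n
    kurt = sum(((x - m) / s) ** 4 for x in intensities) / n - 3.0
    return m, s, skew, kurt

# Same pixel values, very different spatial layouts
clustered = [1.0] * 8 + [9.0] * 8   # segregated blob
dispersed = [1.0, 9.0] * 8          # checkerboard-like
same = histogram_moments(clustered) == histogram_moments(dispersed)
```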
