Manufacturing products of the highest quality is a must, especially in the pharmaceutical and food industries. This requires accurate, reproducible, and simple analysis methods that eliminate human errors as much as possible. Automated titration is one such solution that offers additional time and cost savings to laboratories.
After applying automation to a titration method, how can you ensure that the chosen method also delivers a reliable result? And how do you know that it is suitable for the analysis of your analyte(s)? This requires validation of the titration method, which includes standardization of the titrant as well as determination of accuracy and precision, linearity, and specificity.
USP General Chapter <1225> Validation of Compendial Procedures and ICH Guidance Q2(R1) Validation of Analytical Procedures: Text and Methodology define the validation elements – some of the most important ones are described in the following article.
Dilution and weighing errors as well as the constant aging of all titrants lead to changes in the concentration of the titrant. To obtain results that are as reliable as possible, the most accurate titrant concentration is a prerequisite. Standardization of the titrant is therefore an integral part of a titration method validation. The standardization procedures for various titrants are described in the Volumetric Solution section of USP – NF as well as in the free Application Bulletin AB-206 below regarding the titer determination in potentiometry.
The titrant to be used in the validation must first be standardized against a primary standard or a pre-standardized titrant. It is important that the standardization step and the sample titration are carried out at the same temperature.
Primary standards are characterized by the following properties:
- high purity and stability
- low hygroscopicity (to minimize weight changes)
- high molecular weight (to minimize weighing errors)
The use of a standard substance (primary standard) allows accuracy to be assessed.
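The titer calculation behind standardization can be sketched in a few lines. The sketch below is purely illustrative: the choice of TRIS (tris(hydroxymethyl)aminomethane) as primary standard for HCl and all masses, purities, and volumes are hypothetical example values, not prescribed by any compendial procedure.

```python
# Illustrative titer calculation from a standardization titration.
# All numerical values below are hypothetical examples.

M_TRIS = 121.14        # g/mol, molar mass of the primary standard TRIS
m_standard = 0.2423    # g of TRIS weighed in (hypothetical)
purity = 0.9999        # certified purity of the standard (hypothetical)
V_consumed = 19.85     # mL of titrant consumed to the equivalence point
c_nominal = 0.1        # mol/L, nominal concentration of the HCl titrant
stoichiometry = 1      # mol titrant per mol standard (HCl + TRIS is 1:1)

# Amount of standard titrated, then the actual titrant concentration
n_standard = m_standard * purity / M_TRIS                      # mol
c_actual = n_standard * stoichiometry / (V_consumed / 1000.0)  # mol/L

# Titer = actual / nominal concentration; ideally close to 1.000
titer = c_actual / c_nominal
print(f"c(actual) = {c_actual:.5f} mol/L, titer = {titer:.4f}")
```

The resulting titer is then stored with the titrant and applied as a correction factor in all subsequent sample calculations.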
For more information about titrant standardization, check out our other related blog posts:
Accuracy is defined as the proximity of the result to the true value. Therefore, it provides information about the bias of a method under validation. Accuracy should be determined over the entire concentration range.
Precision is usually expressed as the standard deviation (SD) or relative standard deviation (RSD). It expresses how well the individual results agree within an analysis of a homogeneous sample. Here, it is important that not only the analysis itself but also all sample preparation steps are performed independently for each analysis.
Precision is evaluated in three levels:
- Repeatability: the precision achieved by a single analyst for the same sample in a short period of time using the same equipment for all determinations.
- Intermediate precision: analysis of the same sample on different days, by different analysts and with different equipment, if possible, within the same laboratory.
- Reproducibility: precision obtained by analyzing the same sample in different laboratories.
Determination of both accuracy and precision is necessary, as only the combination of both factors ensures correct results (Figure 1).
For titration, accuracy and repeatability are usually determined together. At least two to three determinations at three different concentration levels (in total six to nine determinations) are recommended. For assays, the recommendation is to use a concentration range of 80% to 120% of the intended sample weight.
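The evaluation of such a design can be sketched as follows: mean recovery per level indicates accuracy, while the relative standard deviation (RSD) indicates repeatability. The recovery values are hypothetical example data, not measured results.

```python
from statistics import mean, stdev

# Hypothetical replicate results (% recovery vs. the true value) for
# three determinations at each of three levels of the intended sample weight.
results = {
    "80%":  [99.6, 100.2, 99.8],
    "100%": [100.1, 99.9, 100.3],
    "120%": [99.7, 100.0, 99.9],
}

for level, recoveries in results.items():
    avg = mean(recoveries)                 # mean recovery -> accuracy
    rsd = stdev(recoveries) / avg * 100    # RSD in % -> repeatability
    print(f"{level}: mean recovery = {avg:.2f}%, RSD = {rsd:.2f}%")

# Pooled view over all nine determinations
all_vals = [r for level in results.values() for r in level]
overall_mean = mean(all_vals)
overall_rsd = stdev(all_vals) / overall_mean * 100
print(f"overall: mean = {overall_mean:.2f}%, RSD = {overall_rsd:.2f}%")
```

Acceptance criteria for mean recovery and RSD depend on the analyte and concentration range and should be defined in the validation protocol beforehand.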
Linearity expresses whether a particular method gives the correct results over the concentration range of interest. Since titration is an absolute method, linearity can usually be determined directly by varying the sample size and thus the analyte concentration.
To determine the linearity of a titration method in the range of interest, titrate at least five different sample sizes and perform a linear regression of the sample size against the volume of titrant consumed (Figure 2). The coefficient of determination (R²) is used to assess linearity. The recommendation is to use a concentration range of 80% to 120% of the intended sample weight.
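A minimal least-squares evaluation of such linearity data might look like the sketch below. The five mass/volume pairs are hypothetical example values chosen to span roughly 80% to 120% of a nominal 0.5 g sample.

```python
# Hypothetical linearity data: sample mass (g) vs. titrant volume (mL)
masses  = [0.40, 0.45, 0.50, 0.55, 0.60]
volumes = [8.02, 9.01, 10.00, 11.02, 11.98]

n = len(masses)
mean_x = sum(masses) / n
mean_y = sum(volumes) / n

# Least-squares slope and intercept
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(masses, volumes))
sxx = sum((x - mean_x) ** 2 for x in masses)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# Coefficient of determination R²
ss_res = sum((y - (slope * x + intercept)) ** 2
             for x, y in zip(masses, volumes))
ss_tot = sum((y - mean_y) ** 2 for y in volumes)
r_squared = 1 - ss_res / ss_tot

print(f"V = {slope:.3f} * m + {intercept:.3f},  R² = {r_squared:.5f}")
```

An R² very close to 1 (the exact acceptance limit belongs in the validation protocol) and an intercept close to zero support linearity over the tested range.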
Impurities, excipients, or degradation products are among the many components that may be present in a sample. Specificity is the ability to evaluate the analyte without interference from these other components. Therefore, it is necessary to demonstrate that the analytical procedure is not affected by such compounds. This is the case when either the equivalence point (EP) found is not shifted by the added impurities or excipients, or, if it is shifted, a second EP corresponding to these added components can be observed when a potentiometric sensor is used for indication.
Specificity may be achieved by using suitable solvents (e.g., non-aqueous titration instead of aqueous titration for acid-base titration) or titration at a suitable pH value (e.g., complexometric titration of calcium at pH 12, where magnesium precipitates as magnesium hydroxide).
How can this be implemented in practice? The titrimetric determination of potassium bicarbonate with hydrochloric acid will serve as an example here.
Follow along with more details in our free White Paper:
In this case, potassium carbonate is expected as an impurity, with pKb values of 8.3 and 3.89. This makes it possible to separate the two species during the acid-base titration. Figure 3 shows an overlay of the titration curves of potassium bicarbonate with and without added potassium carbonate.
The lower titration curve corresponds to the solution containing both potassium bicarbonate and potassium carbonate. Two EPs are found here: the first EP can be assigned to the added potassium carbonate, while the second corresponds to the sum of potassium bicarbonate and potassium carbonate. The curve at the top of the figure clearly shows only one EP for the potassium bicarbonate solution without impurities.
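Based on this EP assignment, the expected equivalence point volumes can be estimated from the stoichiometry: carbonate consumes one proton up to EP1 (CO3²⁻ → HCO3⁻), and all hydrogen carbonate, original plus that formed from carbonate, consumes one proton between EP1 and EP2. The sketch below uses hypothetical sample weights and titrant concentration purely for illustration.

```python
# Hypothetical sample: KHCO3 with a small K2CO3 impurity, titrated
# with 0.1 mol/L HCl. Molar masses in g/mol.
M_KHCO3 = 100.12
M_K2CO3 = 138.21
c_HCl = 0.1            # mol/L (hypothetical titrant concentration)

m_KHCO3 = 0.300        # g (hypothetical weighing)
m_K2CO3 = 0.030        # g of carbonate impurity (hypothetical)

n_KHCO3 = m_KHCO3 / M_KHCO3
n_K2CO3 = m_K2CO3 / M_K2CO3

# EP1: CO3^2- + H+ -> HCO3-  (only the carbonate impurity reacts)
V_EP1 = n_K2CO3 / c_HCl * 1000                            # mL

# EP2 (cumulative): all HCO3- (original + formed from carbonate)
# is protonated further to H2CO3
V_EP2 = (n_K2CO3 + (n_KHCO3 + n_K2CO3)) / c_HCl * 1000    # mL

print(f"EP1 at {V_EP1:.2f} mL, EP2 at {V_EP2:.2f} mL")
```

Comparing the EP volumes found in the titration against such stoichiometric expectations confirms that the first EP indeed quantifies the carbonate impurity alone.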
If you follow the recommendations above, you will be ready for titration method validation – and now it's time to get started!
Using potentiometric autotitration instead of manual titration increases the accuracy and reliability of your results. In addition, the use of an autotitrator ensures that critical regulatory compliance requirements, such as data integrity, are met. Read more in our free Application Note.
Right from the start, Metrohm products provide peace of mind and confidence in the quality of the data you produce with proper IQ/OQ.
If you would like to learn more about Metrohm Analytical Instrument Qualification, have a look at our two blog posts dedicated to this important topic.
Additional security is also provided, e.g., by Metrohm Buret Calibration, which ensures that the accuracy and precision of your dosing device are within the required tolerances. Traceable monitoring of the performance and function of the instrument through regular requalifications and tests is therefore a given.