
Have you ever performed a conductivity measurement and obtained incorrect results? There are several possible reasons for this. In this post, I want to show you how you may overcome some of these issues.

By itself, conductivity measurement is quite easy to perform. You take a conductivity cell and a suitable measuring device, insert the cell into the sample solution, and read the displayed value. However, there are some challenges, such as choosing the right sensor, the temperature dependency of conductivity, or the uptake of CO2, which can falsify your results.


So many measuring cells – which one to use?

The first and most important question about conductivity measurement is: which sensor is the most suitable for your application? The measuring range is dependent on the cell constant of your conductivity cell, and therefore this choice requires a few considerations:

  • What is the expected conductivity of my sample?
  • Do I have a broad range of conductivities within my samples?
  • What is the amount of sample I have available for measurement?

There are different types of conductivity measuring cells available on the market. Two-electrode cells have the advantage that they can be constructed with a smaller geometry and are more accurate at low conductivities. Other types of measuring cells, on the other hand, are not affected by polarization, have a larger linear range, and are less sensitive to contamination.

Figure 1. Illustration of the range of applications for different conductometric measuring cells offered by Metrohm.

Figure 1 shows you the wide application range of sensors with different cell constants.

As a general rule: sensors with a low cell constant should be used for samples with low conductivity, and sensors with a high cell constant for samples with high conductivity.

To get more information, check out our Electrode finder and select «conductivity measurement».


Determination of the cell constant

Each conductivity cell has its own cell constant, which therefore needs to be determined regularly. The nominal cell constant depends on the area of the platinum contacts and the distance between the two surfaces:

$K = \frac{d_{\mathrm{electrodes}}}{A_{\mathrm{eff}}}$

K :  Cell constant in cm⁻¹
A_eff :  Effective area of the electrodes in cm²
d_electrodes :  Distance between the electrodes in cm
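
As a minimal illustration of this relationship, the short Python sketch below computes the nominal cell constant from the electrode geometry. The function name and the electrode spacing and area are invented example values, not the specification of any particular sensor.

```python
# Nominal cell constant of a two-electrode conductivity cell:
# K = d / A  (electrode spacing divided by effective electrode area)

def nominal_cell_constant(distance_cm: float, area_cm2: float) -> float:
    """Return the nominal cell constant in cm^-1."""
    return distance_cm / area_cm2

# Hypothetical geometry: 1 cm electrode spacing, 1 cm^2 effective area
print(nominal_cell_constant(distance_cm=1.0, area_cm2=1.0))  # 1.0 cm^-1
```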

However, no sensor is perfect, and the effective cell constant does not exactly agree with the nominal cell constant. The effective cell constant is therefore determined experimentally by measuring a suitable standard: the measured conductance is compared with the theoretical conductivity of the standard:

$K = \frac{\gamma_{\mathrm{theor}}}{G_{\mathrm{meas}}}$

K :  Cell constant in cm⁻¹
γ_theor :  Theoretical conductivity of the standard at the reference temperature in S/cm
G_meas :  Measured conductance in S
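
The same calculation can be sketched in a few lines of Python. The 12.88 mS/cm standard value is the one mentioned later in this post; the measured conductance is an invented number used only to illustrate the division.

```python
# Effective cell constant from a calibration standard:
# K = gamma_theor / G_meas

def effective_cell_constant(gamma_theor_s_per_cm: float, g_meas_s: float) -> float:
    """Return the effective cell constant in cm^-1."""
    return gamma_theor_s_per_cm / g_meas_s

# Example: 12.88 mS/cm standard; the measured conductance of 15.5 mS
# is a hypothetical reading chosen purely for illustration.
k_eff = effective_cell_constant(gamma_theor_s_per_cm=12.88e-3, g_meas_s=15.5e-3)
print(f"Effective cell constant: {k_eff:.3f} cm^-1")  # ~0.831 cm^-1
```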

Over its lifetime, the properties of the measuring cell might change, and with them its cell constant. Therefore, it is necessary to check the cell constant with a standard from time to time and to redetermine it if necessary.

Temperature dependency of the conductivity

Have you ever asked yourself why conductivity values in the literature normally refer to 20 °C or 25 °C? The reason is that conductivity is very temperature-dependent: values measured at different temperatures are difficult to compare, as the deviation is approximately 2%/°C. Therefore, make sure you either measure in a thermostated vessel or use a temperature compensation coefficient.

What is a temperature compensation coefficient anyway?

The temperature compensation coefficient is a correction factor which converts the value measured at a certain temperature to the defined reference temperature. The coefficient depends on the sample matrix and is different for each sample.

Figure 2. The blue curve shows the actual conductivity (mS/cm) and the orange line is a linear temperature compensation. The temperature compensation coefficient here varies from 2.39–4.04 %/°C.

For example, if you measure a value of 10 mS/cm at 24 °C, the device applies a linear correction of 2%/°C and reports 10.2 mS/cm at the reference temperature of 25 °C. This linear temperature compensation is very common and is implemented in most devices.
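
To make the arithmetic explicit, here is a minimal Python sketch of the simple linear correction described above. The function and parameter names are my own; real meters perform this correction internally, so this is only meant to illustrate the calculation.

```python
# Simple linear temperature compensation as described above:
# corrected = measured * (1 + alpha * (T_ref - T_meas))
# alpha is the temperature compensation coefficient (0.02 per deg C = 2 %/deg C)

def compensate_linear(measured_ms_cm: float, t_meas_c: float,
                      t_ref_c: float = 25.0, alpha_per_c: float = 0.02) -> float:
    """Convert a conductivity reading to the reference temperature."""
    return measured_ms_cm * (1 + alpha_per_c * (t_ref_c - t_meas_c))

# The worked example from the text: 10 mS/cm measured at 24 deg C -> ~10.2 mS/cm at 25 deg C
print(compensate_linear(10.0, t_meas_c=24.0))
```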

However, the temperature dependence is not linear for every sample. If linear temperature compensation is not accurate enough, you can instead record a temperature compensation function: measure the conductivity of your sample at different temperatures and then fit a polynomial function through the measured points. This polynomial function is used for future temperature corrections, and more accurate results are obtained.
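
To show how such a compensation function could be applied, the sketch below fits a second-order polynomial to hypothetical conductivity-versus-temperature data with NumPy and uses the ratio of the fitted values to refer a reading to 25 °C. The data points, the polynomial degree, and the way the correction is applied are assumptions for illustration only; instruments may apply the recorded function differently.

```python
import numpy as np

# Hypothetical data: conductivity of one sample measured at several temperatures
# (values invented purely to illustrate the idea).
temps_c = np.array([15.0, 20.0, 25.0, 30.0, 35.0])
cond_ms_cm = np.array([8.1, 9.0, 10.0, 11.2, 12.6])

# Fit a 2nd-order polynomial describing conductivity as a function of temperature.
coeffs = np.polyfit(temps_c, cond_ms_cm, deg=2)
poly = np.poly1d(coeffs)

# Refer a reading at 24 deg C to the 25 deg C reference by scaling with the
# ratio of the fitted curve at the two temperatures.
t_ref, t_meas, measured = 25.0, 24.0, 9.8
corrected = measured * poly(t_ref) / poly(t_meas)
print(f"Corrected to {t_ref} deg C: {corrected:.2f} mS/cm")
```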

And… what about the conductivity standard?

Which standard do I have to choose?

In contrast to pH calibration, a conductivity cell requires only a one-point calibration. For this purpose, you need to choose a suitable standard which has a conductivity value in the same range as your sample and is inert towards external influences.

As an example, consider a sample of deionized water, which has an expected conductivity of approximately 1 µS/cm. If you calibrate the conductivity cell with a higher conductivity standard around 12.88 mS/cm, this will lead to an enormous error in your measured sample value.

Most conductivity cells will not be suitable for both ranges. For such low conductivities (1 µS/cm), it is better to use a 100 µS/cm conductivity standard. Even lower conductivity standards are available, but their proper handling becomes more difficult, because at such low conductivities the influence of CO2 increases.

Last but not least: To stir or not to stir?

This is a controversial question, as stirring has both advantages and disadvantages. Stirring enables your sample solution to be homogeneous, but it might also enhance the carbon dioxide uptake from ambient air.

Either way, whether you choose to stir or not, just make sure that the same procedure is applied each time, both for the determination of the cell constant and for the determination of the conductivity of your sample. Personally, I recommend stirring slightly, because a stable value is then reached faster and the effect of carbon dioxide uptake is almost negligible.

 

Summary

It is quite easy to perform conductometric measurements, but some important points should be considered thoroughly before starting the analysis: the temperature dependency, the choice of a suitable conductometric measuring cell, and the choice of calibration standard. Otherwise, false results may be obtained.

Author

Iris Kalkman

Product Specialist Titration
Metrohm International Headquarters, Herisau, Switzerland
