
"Sensors are great, but they have their limitations" - that's where analysers come in!

In this episode of The Process Industry Informer Podcast, Paul Hamilton (Metrohm Process Analytics Business Manager) joins Dave Howell to discuss simple and innovative ways for Process Engineers to perform complex chemical analysis.

Metrohm is best known for producing analytical laboratory hardware and software, but Paul explains how the process analysis side of the business brings all of the typical laboratory analysis techniques and integrates them into the process plant, producing lab-quality data in the plant, fully automated.

Compared to some other countries, the UK is behind when it comes to the process industry, but a shift is happening, and now is the time to innovate and automate. There are countless ways in which a process can be automated, and most people don't even realise it! A lot of companies rely on intermittent data from a lab, and that creates a bottleneck. Automating your process, by contrast, provides real-time, detailed data, which allows you to be proactive and make fast, accurate decisions; this, in turn, eliminates downtime, reduces costs, and raises efficiency.

Paul discusses real-life examples of companies and industries that have embraced automating their process and the results they have seen… often automation is much simpler than people think! Paul talks about how you can retrofit an analyser into your process, and how surprisingly easy that can be.

Lastly, Paul talks through what's next for the process industry… spectroscopy?…

Don’t miss this insightful and eye-opening episode!

Search "PII Podcast" on your favourite podcast platform or listen here!

Prefer to read the content of the Podcast? Enjoy the transcription below!

Dave: Welcome to the Process Industry Informer podcast, where process professionals can learn and share ideas and information. Welcome to the latest podcast from Metrohm, one of the world's most trusted manufacturers of high-precision instruments for laboratory and process analysis. I'm your host, David Howell. Joining me today is Paul Hamilton, Process Analytics Business Manager at Metrohm UK and Ireland. Welcome, Paul.

Paul: Thank you for having me. 

Dave: As I said, I was reading about what the company has been doing and how you've innovated in this space. It's a very interesting conversation we're going to have. But before we do that, I think it will be useful to get a little bit of background about yourself, and maybe a little about the company too, as there may obviously be listeners who don't know Metrohm. So who's the company? What do you guys get up to? And maybe a little bit about your background as well, Paul.

Paul: So I'm a chemist by training. I almost immediately left chemistry behind and joined the oil and gas industry. That's what a lot of people seem to do. I joined the oil and gas industry straight out of uni, basically, and I spent about seven years doing that, some of it overseas. When I came back to the UK, I joined Metrohm, which is a bit closer to my actual training in chemistry. So I've been working with Metrohm now for about seven years as well. Most of that has been on the process side of things: process chemistry, process automation. I brought a lot of my engineering background from oil and gas and some of my chemical background from my formal training, so it's a nice job for me, a nice mix. Most people that have heard of Metrohm will have heard of them, for the most part, for their laboratory hardware. Big, big Swiss company, very well known in the analytical instrumentation field, but less well known in the process environment. What the process side of the company does is take the same lab hardware: titration, photometry, ion chromatography, voltammetry, spectroscopy. There are lots of different things the company can do on the bench. On the process side, we take the same hardware and we integrate it into the process plant, so that instead of taking samples from various points in the process, taking them to the lab and having an analyst run them, we produce that sort of lab-quality data, lab-quality analysis, in the plant, fully automated.

Dave: I find that a very interesting shift. It always seemed to me that with an incumbent process, particularly a complex process (maybe, as you say, in oil and gas, chemical, I would even say automotive), we've been used to asking: how do we analyse that? How do we test this stuff? We could stick a sensor on something, et cetera, but is that going to be good enough moving forward? I speak to a lot of process managers and engineers, and they're evolving their processes. It could be because of new technology, a new strategy, it could be a completely new process, but they are looking for, I guess, more automation. They're looking at how they can be more accurate with that kind of testing, and particularly the data that comes off those sensors, so they can make their plant more efficient, with less downtime, and of course, ultimately reduce costs. Now, are you finding that those are the kinds of conversations you're having more and more with process companies? How can we innovate in this space? Clearly, you've been thinking about this for some time. How can we do lab-level kinds of testing? Can we do that in-plant and in-process? Well, yes, you can. Is that often a revelation for process managers, or have they been aware of it but maybe hadn't got the mechanics right, or hadn't got, I guess, the tools to be able to implement it?

Paul: It's unfortunately a revelation, if I'm honest. A lot of people say, oh, you couldn't automate that. And you think, well, you can actually. You can. We have automated that. We have use cases for that. That's a really common conversation. When people are building plants, building processes, they obviously have in their mind that they can automate certain things. They know they can get differential pressure sensors. They know they can get flow sensors. They know they can get pH probes. So that's part of their design process. But when it comes to this more complex stuff, oftentimes they're not even thinking about it. The perfect time for us to catch these things is when people are building, and if they do contact us then, it can fold into the design schedule. That's great, but you can do it retrospectively. It just comes with more challenges when you do it retrospectively. So it's nice for us to try and get these things whenever people are building new plants, but it is improving. I would say that in the UK specifically, we have some way to go on that. If you compare us to our German colleagues or the Americans, even the Chinese, we're a step behind. There's a lot more automation that goes into these plants elsewhere. I mean, I had a conversation with one of the process engineers on a project that we did, about the communication and the signalling. We asked them, do you want to do this over a communication protocol? You don't have to be running hundreds of copper cables everywhere. And they said, oh no, that's too modern for us. And we're thinking, well, Modbus is a protocol from 1979, you know, and that was too modern for them. I think it's a bit of a step change that UK manufacturing has to go through, and it's happening, but I think it's happening slowly.

Dave: Yeah, I think that's right. I think it's that whole thing, isn't it, about if it ain't broke, don't fix it. We don't want to muck around with something that's working. It's working, just leave it alone. I guess the worry is, as you just pointed out: if it's an established process, and it could be highly complex, how do we do this stuff? They come to Metrohm and say, well, we'd love to do this, but... is it going to be a massive amount of downtime? Can we retrofit onto maybe an existing analytical application we already have, to make sense of the data coming off what you guys are putting in place? These kinds of questions. It seems to me that it also requires a mindset shift. If you want to do more of this (and I think every process is trying to automate more and certainly generate more data), then you're going to have to make a change. So how often is that the case? Can you give us a feel for maybe a couple of analytical aspects of a process? How would they actually be put into, say, an existing process setup? Just to give a feel for how easy, or complex, or somewhere in the middle, it is to actually put these kinds of technologies in place to gather the data, which I think every process manager is trying to get hold of.

Paul: I'll give you a couple. First would be an easy one. A really easy one. If you think of surface finishing lines, metal finishing lines, you just have rows of atmospheric tanks: a cleaning tank, a stripping tank, an etch tank, a plating tank, whatever it is. So you have these big lines of tanks. They're generally at ambient temperature. I know they can be different, but for the most part they're at ambient temperature, ambient pressure, with simplistic chemistry. You can just have acid baths, caustic baths, ferric baths. They're not hugely complex in their chemistry. And those things are really easy to automate. You can come along and put a unit next to the line that automatically samples, automatically tests, and automatically reports back to the control system or PLC, whatever is running the line. That stuff is really easy to retrofit, and actually there's not a massive amount of downside. We do this for some major household-name automotive and aerospace manufacturers already. What it means is the operators and the process owners are in control of the tanks and of the baths; they don't have to wait to get results from the lab. They can just put things through the process whenever they feel like it, and they have that data stored. So that's really easy to do, and those retrofits are easy. There are other cases where we've done retrofits with difficult samples. In one case, we had a customer with a quite challenging sample: a high-temperature sample with lots of solids. They would feed the sample into ion exchange columns, which, I'm assuming, this audience will be familiar with, but if you weren't, it's basically a massive Brita water filter. You put the sample through, and it takes calcium and magnesium out. What the lab was doing, on the end of these big filters, was taking a sample every hour, analysing it in the lab, reporting that to the control room, and the process engineers were making decisions about when to switch to new columns based on that, because you have high calcium coming in at the top and you want low calcium coming out at the bottom, so that you don't have all this scale further downstream.

Dave: That's a huge possible delay, isn't it, before you can make a decision? And I think that's the thing to get across with putting these kinds of technologies in place: it completely mitigates that. You can have, I imagine, real time, because it can sample however you want to set these things up. In that scenario, for instance, it could take a couple of hours to get the results back. So you might wait two hours before you make a decision about how to change something on your plant, which could be hugely detrimental, it seems to me. Whereas if you can move to, I'm assuming, real time or near real time, that's a huge gain.

Paul: Well, we provide analysers, which are slightly more complex than sensors, so we can get into what that actually means. Some of them are instantaneous, but some of them are batch-wise technology: you have to take a sample and analyse it. In the case of those columns, we were talking once every five or six minutes versus two hours. And as you say, what used to happen there was that sometimes the lab data would look a bit strange and the process guys would say, well, get another sample and run it again, because we're not sure we want to switch the columns. And then you get another result, and by that point it's gone way off the scale. You're pumping all this calcium downstream and it's panic stations; they switch over. When you have this really high resolution of data, you can see the trends more clearly and you can trust what's coming out of it: okay, we can see this sharp line coming up, we know the column is going off, let's switch it over. The time scale of getting results back from the lab is slow, but it's also the fact that having more resolution of data is almost always better for controlling these processes.
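The change-over decision Paul describes could be sketched in a few lines. This is an illustration only, not Metrohm's actual control logic: the function name, thresholds, and readings are all invented for the sketch. The point it shows is that with a reading every few minutes, a breakthrough appears as a run of rising values rather than one suspect lab result.

```python
# Hypothetical sketch: flag an ion exchange column change-over once the
# outlet concentration has risen for several consecutive samples AND the
# latest value exceeds a configured limit. All numbers are invented.

def should_switch(readings, limit=2.0, rising_points=3):
    """Return True when the last `rising_points` steps rose monotonically
    and the newest reading (e.g. mg/L calcium) is above `limit`."""
    if len(readings) < rising_points + 1:
        return False
    tail = readings[-(rising_points + 1):]
    rising = all(b > a for a, b in zip(tail, tail[1:]))
    return rising and readings[-1] > limit

# Sampled every ~5 minutes instead of every 2 hours: the sharp
# breakthrough curve is clearly visible as a run of rising values.
outlet = [0.3, 0.3, 0.4, 0.3, 0.9, 1.6, 2.4]
print(should_switch(outlet))  # prints True: column is going off
```

A single noisy lab result would not trip this check; a sustained rising trend does, which is exactly the "trust the sharp line" behaviour described above.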

Dave: Yes, I think you're absolutely right. I mean, most process managers say, if we can't see it, we can't do anything; we don't have the data. More data is better. You know, we want more of this information, up to a point. There is a point where we obviously need to look at the information we're getting and, if there's lots of it, which bits of it should be paid attention to. In this scenario, though, it's pretty obvious: if you get a result back, you know what to do. I think that's a given. It does seem to me that you can also go one step further than that. Do you often find that when you're putting these kinds of technologies in place (I don't think you've used the word 'proactive', but that's kind of what it is), you're proactively looking at plant and machinery and process, particularly with whatever you're trying to measure and sample? That gives you a proactive view of what could potentially be downtime later on, downstream. Or even an interesting way of, I guess, not second-guessing what's going to happen, because you have the data, but starting to be more proactive about how you change things when you're looking at a potential failure. I think that's a very important thing for process managers. Can we be more proactive in that space?

Paul: Well, yes, you can if you have more data. If you have more data, the interesting thing about some of these projects, that one in particular, that was a retrofit and it was quite a complex retrofit to be fair, but we did it. It was a good project. And in that case, the interesting thing was we had discussions with the process engineers because there were some questions about data and they said, well, that can't be real. We don't see that. That must be some kind of artifact. And it wasn't. So they were getting this data they'd just never seen before. They were able to tell things about column capacities and stuff that they'd never seen. So I think there are some things like that when you do projects and they kind of grow arms and legs because the data that you get is so valuable, it allows them to do things that they weren't even considering doing before they automated.

Dave: It's a revelation. It moves things on. Something that was invisible or completely out of focus is suddenly in sharp relief, and you can start to make decisions about it. I think that's the important thing to get across: you suddenly have more information with which to make what can be quite serious, and very time-sensitive and cost-sensitive, decisions about your plant. I think that's important to understand. Also, you mentioned it just a second ago, and it's a question I wanted to get into fairly early: we are not talking about sensor technology per se, are we? This is not a sensor you can put onto a valve. It's not that. I think it'd be important to put some definitions in place about what you guys can actually do with analysers rather than sensors. Maybe sensors is the wrong word.

Paul: Yeah, it is a key distinction because as you say, sensors are great. We say this to a lot of people. We are realistic. So when we go into meetings and people say, can we do this? If there's something that you can do with a sensor, then a sensor is the easier option always. If it's some information you can get from a sensor and there are some inaccuracies in sensors and some compromise there, but if that's good enough for your process, then that's great. And that's what we would always counsel people to use if they can. Well, where we tend to operate is in the case where there are no sensors. So it's more complex analysis that's required. And usually that's to take a sample. This is automated, fully automated, but you take a sample from a process, you do some kind of analysis, and we try and keep that analysis as close to what the lab method is, as possible. So it's an apples to apples comparison. And in our case, sometimes we are competing against our own kit in the lab, if that's titration kit or ion chromatography kit, whatever it might be. You are taking a sample and doing a batch-wise analysis. And depending on what that is, that can be five minutes, it could be 20 minutes. It's really dependent on the implementation. But what that means is you have like a unit and it's like an IP 66 unit that is a field unit that goes in the plant. But it has pumps and valves and has analysis hardware in there. So it's a bit more complex than just installing a sensor onto a line. Sensors are great, but they have limitations. And there are certain things, for example, if you're doing a batch reaction, you're doing a polymerisation reaction and you have a batch reactor. If you want to take a sample and you know what the total acid number is because you're following polymerisation reaction. That's something you're just going to struggle to do with a simple sensor. You need to have an analyser to do that because it's just a more complex parameter. 
In this scenario, we're thinking about analysts and what that could do for your process,

Dave: Absolutely, you're taking samples, but that's not a sensor. You may have sensors in some areas, absolutely, telling you different kinds of data. But this conversation is about how you're doing an analysis on certain aspects of your process. That's the important thing to get across. Now, it also seems to me that when you start to think about how you can do these kinds of quite detailed pieces of analysis, that opens your plant up to something very interesting. It seems to me that you can then start to look over time. You start to get information about trends. You start to get information about what's happening in certain areas of your plant: how certain sections of your plant, or even processes, are interacting with each other, which again you may not have seen before. So are you seeing that as well? When something's been in place for a while, is the feedback you're getting that we suddenly start to see more in the way of trends, et cetera, what's happening? And we can be more proactive: we can look at what's happening here, but we couldn't see that before. These two things interact together; we had no clue that was happening. And that gives us these kinds of long-term pieces of analysis: again, massive amounts of data, but we can understand it because we have the analytics in place.

Paul: Yeah, exactly. There are numerous cases of that happening. I mean, in the one I just mentioned, they learned a lot of things about the process which they didn't know, just because they had volumes of data. We had one in the water industry as well, where we were monitoring manganese levels. There were routine spikes, and again, the customer was saying, well, this can't be realistic; these spikes aren't realistic. And it turned out they had some filters that were backflushing themselves, and every so often when they would backflush, you'd get these spikes coming through. They lined up almost exactly, but they didn't know. They just didn't know that this was happening. I think we're already there with physical parameters; people have so much data on that. There are a lot of people doing analytics with cloud data, building models and digital twins and everything, and that's great when you have a substantial amount of data on physical parameters. But for chemical parameters, we just aren't there yet. Part of what we do is really try to ramp up the amount of data that people have about their process, because really most people are relying on intermittent samples from the lab, and that is the main data source. And it is a bottleneck, but it's not only a bottleneck with regard to frequency; it's the fact that, as you say, if you want to do these more complex things with data, you need a sizable pool of data to do it, which most people just don't have.
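The manganese story lends itself to a small worked example. This is a hypothetical sketch, not the customer's actual data or code: the readings, the spike threshold, and the backflush timings are invented. It shows how, once you have dense data, apparent artifacts can be matched against plant events like filter backflushes.

```python
# Hypothetical sketch: match manganese spikes against filter backflush
# events. All values and thresholds are invented for illustration.

def spikes(series, baseline=0.05, factor=3.0):
    """Return sample indices where a reading exceeds factor x baseline."""
    return [i for i, v in enumerate(series) if v > factor * baseline]

def explained_by_backflush(spike_times, backflush_times, window=2):
    """A spike is 'explained' if a backflush occurred within `window`
    sample periods before it."""
    return [t for t in spike_times
            if any(0 <= t - b <= window for b in backflush_times)]

mn = [0.04, 0.05, 0.31, 0.05, 0.04, 0.28, 0.05]  # mg/L, regular samples
backflushes = [1, 4]                             # backflush sample indices
found = spikes(mn)
print(found, explained_by_backflush(found, backflushes))
```

With hourly lab samples the spikes would likely be missed entirely or dismissed as bad results; with dense data every spike here lines up with a preceding backflush.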

Dave: No, it's one of those areas which is evolving so quickly. I think a lot of process people look at that and say, well, we can do that now, which is fantastic, but how is it going to play with maybe incumbent technologies? Maybe it's useful also to, I guess, put what you guys are doing into context. So we have some kind of analyser in place; we're pulling the data off that. Do you guys get involved with interfacing that back to maybe existing plant and machinery UIs, or remote access, all of these kinds of things? I'm very interested in that. We have the data, we can pull it off. We can do the analysis literally by the side of, as you said, the actual process itself. We have the data, the data's pulled off. What do we do with it? How do we analyse it? How do we put it into maybe the context of a wider plant where we do have lots of sensors, et cetera? How do you explain that to, say, a process manager? So we'll put that in place, we have the data: how are you then going to look at that data for tangible results?

Paul: Yeah, that's up to them. I mean, the thing is, people are really, really different. Our systems are capable of interfacing with almost anything. We have some customers that want one parameter and that's all they want. They want one parameter, they want it to go to the control system on a 4-20 milliamp output, and that's all they care about. That's it. And some people have these really complex Modbus or Profibus or Profinet setups; there's so much communication back and forward with the control system that it takes us quite a bit of time to set it all up during commissioning. And everywhere in between. So it's really dependent on what a customer wants. I'm always in favour of, as I say, moving towards proper serial communication, properly running Cat 5 cables, because in the fullness of time, if you want more information out of your system, you can get it; whereas if you just run some copper cable, you're kind of limited to what you have. So it is really dependent on customers. It's better when they're building plants from scratch, because then they're a bit more idealistic about what they can get from the system, and they're more open to taking a lot of things. But oftentimes you find with retrofits, they just want to know: what is the zinc concentration here? And that's all they care about, which is fine. We can do that. The more interesting ones are the ones that are more complex and allow us to do some interesting programming.
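For readers less familiar with the "one parameter on a 4-20 milliamp output" option Paul mentions: the analyser result is linearly scaled onto the 4-20 mA current loop the control system reads. A minimal sketch of that scaling, with an invented measurement range for illustration:

```python
# Minimal sketch of mapping an analyser result onto a 4-20 mA output.
# The 0-50 g/L range below is hypothetical, chosen for the example.

def to_milliamps(value, lo, hi):
    """Linearly scale a measurement in [lo, hi] onto 4-20 mA,
    clamping out-of-range values to the loop limits."""
    if hi <= lo:
        raise ValueError("hi must be greater than lo")
    frac = (value - lo) / (hi - lo)
    frac = min(max(frac, 0.0), 1.0)   # clamp to the configured range
    return 4.0 + 16.0 * frac

# e.g. a zinc concentration of 25 g/L on a configured 0-50 g/L range:
print(to_milliamps(25, 0, 50))  # -> 12.0 (mid-scale)
```

The live-zero at 4 mA is part of why the loop is popular: a reading of 0 mA is distinguishable from a true zero measurement and usually indicates a broken wire or dead transmitter.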

Dave: I think it's also important to get across. So someone says, okay, we want to retrofit our plant, which is probably, I would imagine, most of the stuff you guys get involved with. Obviously, it'd be fab to be in on the design phase for a new process; that's the ideal. But most of it is probably going to be retrofitting. As you said, we want to look at a concentration or something, or just some kind of spectroscopy, whatever it is. When we actually go to do that, when it's, okay, we're going to put this in place as a retrofit, can you talk a little bit about how that actually happens, the actual process of putting those kinds of analysers in place? I imagine there are aspects like calibration, et cetera, and as you say, the systems it needs to interface with to give the data to the operators. These kinds of questions. Right at the top of this conversation, you were saying that often the misunderstanding is that it's a huge root-and-branch job, with huge amounts of downtime and all the rest of it. From what you're saying, that's not the case with a lot of this; it can be retrofitted quite quickly with a minimum amount of downtime. But can you talk through a typical process for us? Pick one of the many kinds of analysis you can do, maybe if you have a case study in mind where you did that for a client, and how that process actually, I guess, unfolded?

Paul: Yeah, I mean, with retrofits it depends on exactly what the retrofit is. I think what's key to where we're a bit different at Metrohm (you can type 'process analysers' into Google and there are other vendors) is that we have this hugely flexible hardware. We don't really have an analyser, you see; we have a technology platform, and it allows us to go into customer sites, and if they want to do something, we can design something. We can make it work for them. When it comes to retrofits, most of the time there's no downtime, realistically, or there's a very limited amount of downtime if you've got to alter sample pipework, but it really depends on the customer. For example, if we go back to the case of surface finishing, the downtime for the surface finishing automotive and aerospace companies is essentially nil, because all you need to do is run pipework into tanks that are atmospheric. It's very easy for us to sample. Obviously, if you're breaking into pipework and having to install TPCs and flanged connections to get sample out to your system, then yes, there's some small amount of downtime, but generally it's not invasive as far as connecting to these processes goes. So we tend to go and see what the customer wants, we draw something up, we make a proposal. There's usually some interaction with their site engineers, or if they've got contracting engineers that do things like pipework and control systems, there's a little bit of back and forth, but that's all ongoing whilst the process is running as normal. In the cases where we're actually doing hard process control, where we switch certain aspects of controlling the process over to an analyser running off results, you commission a system, you run it for a number of months, and then you switch over to full automated control. In the example of the columns, where you're deciding when to switch over the ion exchange columns based on results, you wouldn't do that on day one. You would look at the data over a couple of months and then make the call to switch over. But no, I think that's right. The point I'd get across is that it is bespoke. It's whatever you need to do. It's not limited by 'we can't do that because we have a particular kind of process or plant'. That's not the case.

Dave: As you say, it's an analyser environment, so there's an ecosystem which can be applied to pretty much any kind of plant and process. Again, I think a lot of people are thinking, oh, it's sensor technology; it's not, it's not like that. Because I think a lot of the fear with that kind of thing is vendor lock-in: it's proprietary, what if we want to make a change, it doesn't play well with others, those kinds of worries. But that's not the case with your systems, because they are completely bespoke. In effect, they are, I guess, vendor-agnostic, because that's the whole point.

Paul: Yeah. And, I mean, they are bespoke, but I think 'bespoke' sometimes scares people as well, because it sounds as if you're designing something from scratch every time. It's bespoke with regard to your implementation. The components we use to build the system are not bespoke; we use them every time: pumps and valves and whatever it is that we need to make your system work. It's the application. It's like someone fitting a kitchen in your house: your kitchen's bespoke to your situation, but the individual bits and pieces they're using are not. I think that's where there's a difference with us. There are other people involved in this space, but they tend to have systems that are off the shelf: it can do this, and it can do it in this range, and it can handle two sample streams, and if it doesn't work for your process, then good luck. They're limited in order to make it off the shelf. We're exceptionally flexible, which means even if you install something and you say, you know what, it could do with an extra one of this or an extra one of that, it's very easy for us to go in and change it. That flexibility is what makes our systems work, because usually you find, when people have experience of doing this kind of thing in the past and they say, oh, that didn't work for us, the reason it didn't work was a very small issue which could easily have been solved if you just had a little bit of flexibility with respect to programming or hardware. We do, but a lot of other people in this space don't.

Dave: That's true. As you were speaking, I wrote 'flexibility' on my pad, I wrote 'agile', I wrote 'dynamic'. I think that's the point to get across: you can put this stuff in place, and if it isn't working for you, it's not a case of, oh well, that's scrapped, we start again. That's not the case. These systems have that flexibility built in. So if you do need to make a small change, or even a large change (if something needs tweaking, some response, or you find some anomaly veering way over what the baseline should be, for instance, and want to know what's going on there), you can make the change. That's the whole point of these systems being as agile and, I guess, as flexible as they need to be. It's the agility, it's being able to be flexible.

Paul: That's the whole point to get across. And the nice thing that we have (I'm sure my directors would be annoyed at this) is the support that we give to customers. I would say this, obviously, but there's a bit of above and beyond, and we have remote access. Some customers don't like it, some customers love it. I've had calls before from places where we've automated, and they're heavily reliant on the technology now, and they call you up on a Saturday morning saying, oh, the thing's fallen over, can you look? And I've been sat at my daughter's swimming lesson fixing it on my iPhone, because with remote access I've been able to go in, diagnose the problem and fix it.

Dave: That's so powerful though, isn't it?

Paul: It really is. Hugely, hugely. Ideally, I would rather not be called on a Saturday morning, but the fact that it's possible is quite cool. I mean, I think it's cool. From a customer's point of view, absolutely.

Dave: Because everyone is trying to do remote access, trying to do remote overview, if you like: the whole idea of, let's look at our process, put loads of sensors on it, et cetera, which everyone's trying to do. Of course we have the technology to do that; we can put any kind of sensor on anything, we can generate the data. I think there is an issue that there's too much of that. So how do you look at what's happening with your plant, and false alarms, et cetera; I still think that's an issue. Which ones are we paying attention to, those kinds of things. But I imagine your system looks at, I guess, the landscape of the process, and if it's set up correctly and something spikes, you will get an alert, because that's something you need to pay attention to. And that's very, very important, I think, for downstream, where it may affect something which could end up with downtime, which obviously is an absolute no-no in some respects. The idea is that this sort of overview alerts you to what's happening. If you put these kinds of pieces of analysis in place, that's going to give you even more of that. Then hopefully you can look at your plant in a completely different light and have the data to make those decisions. As you say, if something looks like it's going to fall over, we can do some proactive maintenance and fix it before it happens.

Paul: I think the key thing to mention, with respect to what you look at, is: what's the thing that you would love to have from your plant? When it comes to sensors, those things are relatively inexpensive and, as you say, not very invasive, right? So you can put one of them in quite cheaply. This is different. There is an investment here; the hardware is a bit more complex and it requires a more significant investment. So we tend to only work with people who have a problem where they think, "wouldn't it be great if I had this parameter all the time? That would be excellent." And then they can afford to actually put the money in place to make it happen. So it tends to be key parameters. And that's what I would say to a lot of process people, say at a conference where people are walking past and you have only a couple of minutes to talk to them. I generally say: think of something in your process where, if you could have this every ten minutes, every five minutes, it would make a huge difference to you. If there's something like that, then yes, it's potentially something we can automate. If that doesn't exist for the process, then they're unlikely to ever go down the route of putting in the time and effort to automate these things. But oftentimes people do have that. They do have a couple of parameters that they would love to have; they're just not thinking about the fact that they can automate them.

Dave: I'd say every process plant has got a long list of "Oh, I wish we could do this", or "I wish we could analyse this data, I wish we could see this data". The list is probably endless in some cases, but they would have, I reckon, a top two, three or five things they'd love to be able to do. And then you say, well, you can. And that's kind of a revelation, which is good for process. I think it moves things forward, it shifts the needle massively, and then they can start to actually make things happen tangibly. We're almost out of time, so what I'd like to ask in closing is, I guess, what's next? What's the future for Metrohm, particularly, obviously, in process analytics? Can you give us a feel for what's next? Do you feel there are new technologies coming along? Are you moving into different areas, which I'm sure we'd all be very interested in?

Paul: There's always something, right? At the minute, we've just very recently released a new Raman spectrometer. If anyone is into spectroscopy, they'll know what Raman is. Before that, we released a near-infrared system, which looks very similar, the same sort of form factor. And this year we're releasing an X-ray fluorescence (XRF) system that will be fully online as well, for liquids. But I think the key thing that we've done in the past few years is our current platform, called the 2060 platform. That won't mean anything to you unless you Google "2060 Metrohm process", and then you can go and read about it in your own time. But that's one of the key things. There's a lot of functionality and flexibility in the software there; it's the most advanced software package that we've ever had, really, and it allows us to do a lot of very, very interesting things. So we have that technology now. It's a good time for people to be looking at this and contacting us if they do have any interesting things they would like to automate in their process. The worst-case scenario is that we say, "no, we can't do it". You may as well get in contact and ask us and see if it's something we can help with. Because you never know.

Dave: Absolutely. Well, we've come to the end of what I think you'll agree has been a fascinating podcast about how we can move process forward, particularly if you're looking at raising your level of analysis. If you're still wedded to maybe legacy, lab-based systems, you can move these to the plant very, very easily, and I think that's going to move process forward hugely. To discover more about how Metrohm can help your business innovate right across its process, please do visit www.metrohm.com. I'd like to thank Paul for joining us today. Until the next time, it's a goodbye from me and goodbye from Paul.

Thanks for listening, and tune in soon for another PII podcast, or visit piimag.com to subscribe to our bi-monthly magazine or weekly newsletter and read the latest news from the industry.