Yokogawa Debunks
Calibration or Validation pt 1
Continuing on the topic of trusting your measurements, we brought back our Product Manager for Analytical Products, Nick Crowe, to discuss whether to calibrate or validate your analysers. In this episode, Nick demystifies the differences between the two and uncovers the truth about performing either. Download and listen!
[00:00:11.030] - Sean
This is a Yokogawa Australia and New Zealand podcast. Welcome back to Yokogawa Debunks, conversations with industry experts to uncover the truth behind myths and misconceptions surrounding the industrial automation and industrial instrumentation space. I'm your host, Sean Cahill, and thank you for joining us today. Now today we welcome Nick Crowe, product manager for analysers at Yokogawa Australia and New Zealand. Our regular listeners may remember that Nick joined us on season one of Yokogawa Debunks to discuss some of the misconceptions around liquid analysers.
[00:00:45.850] - Sean
There was a follow-up to that discussion, and through several requests we received through the debunks@yokogawa.com contact address, today we're going to be tackling the question of calibrate or validate: which is better? So without further ado, welcome again, Nick.
[00:01:03.490] - Nick
Thanks, Sean. Thanks for having me back, and it's nice to know that the previous subject was of interest to listeners. And yeah, we've managed to expand on that with another great debate.
[00:01:15.580] - Sean
Look, you know, we get multiple questions from our listeners, but this one that we're addressing today seems to be a particular favourite around the calibrate or validate area. So let's dive straight into the first question that we've received on this topic. Now, it may seem like a bit of an obvious one, but what is the difference between calibration and validation?
[00:01:37.420] - Nick
I suppose to some of us it seems obvious, but to others it doesn't, you know, and it's a question that we regularly get asked here at Yokogawa: what's the difference between the two?
Well, I'll try and sum it up quite simply here. A calibration is something that we would perform if we wanted to ensure that our instrument is accurate, and that can be an analyser or any instrument, really. When we do that calibration, we compare the instrument to a standard of a known value, and if it's not correct, we can adjust the instrument to match. You might do this at various points through the instrument's range to ensure you've got accuracy right across your 4 to 20 milliamp range.
A validation, on the other hand, is a check that you might do just to determine if your instrument is working and functioning as intended. There, you might leave your instrument in the process and flood that instrument, or that analyser, with a standard or something of a known value.
You want to check that your instrument responds, that it can see that value and that it functions as intended, and it doesn't usually involve any adjustment of the instrument. You're purely trying to see if the instrument actually responds to a change. That's pretty brief, but I hope it gives a good starting point for our conversation today.
[00:03:03.910] - Sean
It does, but I guess based on that assessment, which one would you say is the better to perform?
[00:03:10.870] - Nick
Well, that's, I think, really what drives the question that people ask us: what should I be doing? Should I be calibrating or validating? Well, they're both very useful, but they each serve a different purpose, and you'd use them at different times and in different applications as well. A validation is something that you use to determine if your instrument is functioning as intended. So, you might do a validation, and the result of that would perhaps determine if you need to progress on to a calibration; you might use it only to determine if one is required.
[00:03:46.150] - Nick
And the advantage of validation is it can often be performed online with almost no disruption to your instrument and your normal measurement. Really, it's a very useful tool to prove that your instrument is working. Calibrations, on the other hand, you would use to ensure the accuracy of your instrument, so they might make up part of a standard operating procedure where a scheduled calibration is dictated, in the water industry, for example.
[00:04:15.490] - Nick
Different people have different frequencies: some have weekly calibrations, some have monthly calibrations, and so on. Or you might use it in an industry where it's critical that you have proof of your product, industries such as food and beverage or pharmaceutical, where you have to show that your product was produced within a tolerable range. Or maybe you're working in an organisation that has to report emissions to the environmental authorities. In those kinds of applications, you're going to be required to provide frequent evidence that your measurement is correct, so you're going to have to do calibrations there on a frequent basis.
[00:04:51.250] - Nick
So if you don't have any of those requirements, you could use a validation, and then use it to determine, say, if you need to do a calibration. But there are other applications and industries where a calibration will be critical and important to do.
[00:05:07.710] - Sean
So you mentioned there the use of standards for performing a calibration. But if I was looking to do a simple validation, do I need to use a standard, or could I just use, for example, demineralised water or even tap water?
[00:05:23.410] - Nick
Yeah, I suppose that's one of the advantages of validation. In some situations, you might be able to get away with using tap water or potable water or something, and you can use it just to do a sanity check on your instrument. So, if we consider the chemical industry, for example, and a pH analyser, it's quite likely that the process your pH sensor is measuring is going to be far away from the value of tap water. We know that tap water is likely to be somewhere between 6.5 and 7.5 pH.
[00:05:56.260] - Nick
Maybe your process is operating at around pH 3, or perhaps right up at pH 9 or 10, or something like that. If you were to use tap water there and your analyser responded at approximately pH 7, then you know that your analyser is working and responding reasonably well; what you've done there is validated that your analyser is okay. You wouldn't be able to be sure of its accuracy, but you know that it's functional. And then if you have any concerns over the accuracy, perhaps you could use a known standard for your validation or, again, progress on to doing a full calibration.
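To make that tap-water sanity check concrete, here is a minimal sketch of the pass/fail logic it implies. The readings, the tolerance band and the function name are illustrative assumptions for this write-up, not settings from any particular analyser:

```python
# Minimal sketch of a tap-water validation ("sanity check") on a pH analyser.
# The 6.5-7.5 pH band is the rough tap-water figure quoted above; the reading
# and function name are hypothetical.

def validate_with_tap_water(reading_ph: float,
                            expected_low: float = 6.5,
                            expected_high: float = 7.5) -> bool:
    """Return True if the analyser responds roughly where tap water should sit."""
    return expected_low <= reading_ph <= expected_high

# Example: the process normally sits around pH 3, so flooding the sensor with
# tap water should pull the reading up towards neutral.
reading = 7.1  # hypothetical reading while the sensor sits in tap water
if validate_with_tap_water(reading):
    print("Validation passed: the analyser responds, but accuracy is not proven.")
else:
    print("Validation failed: use a known standard or progress to a calibration.")
```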
[00:06:31.250] - Sean
OK, so both procedures are best performed with some sort of known standard. But surely this will require the technician to retract the sensor from the process, or take it out of the process and do the check right next to it, or even take it to a lab. Is there any way of doing these checks, validations and calibrations, in situ, in the process, where perhaps I don't want to remove the sensor because of safety concerns or any other reason?
[00:07:00.590] - Nick
Well, that's actually a very good question, and a valid point that you make there. It's not always practical to be able to remove your sensor from the process. I suppose with a lot of the liquid analyser measurements, as we discussed in our last episodes, we try and encourage people to design their installation in such a way that they can remove the sensor from the process. But there are always going to be occasions where that hasn't been possible, so perhaps we have to think of something a little bit more creative to do a validation there.
[00:07:34.280] - Nick
And in those examples, we might consider using the washing function on an analyser. We can turn on our washing function and it's going to cause a deflection in the measured values. After the washing cycle is finished, the analyser should return to that previous process value, and again, what we've done is validated that the analyser is working. Some analysers will have built-in diagnostics, and they'll give you some advice on the result: a pass or fail, or some kind of warning message if there is no response,
[00:08:04.730] - Nick
or if it doesn't return to its previous value, or close to it. So in those situations where you really can't pull your sensor out of the process, we can do things like that. And then, you know, with gas analysers, it's perhaps even harder to pull your sensor out of the process, so gas analysers are slightly different.
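As a rough illustration of that wash-cycle check, here is a sketch of the deflection-and-recovery logic Nick describes. The thresholds and readings are made-up assumptions; analysers with built-in diagnostics would apply their own criteria internally:

```python
# Sketch of a wash-cycle validation: the wash should deflect the reading,
# and the measurement should then return close to its pre-wash value.
# The thresholds and readings below are hypothetical.

def wash_cycle_validation(baseline: float,
                          during_wash: float,
                          after_wash: float,
                          min_deflection: float,
                          recovery_tolerance: float) -> str:
    deflected = abs(during_wash - baseline) >= min_deflection
    recovered = abs(after_wash - baseline) <= recovery_tolerance
    if deflected and recovered:
        return "PASS: sensor responded to the wash and recovered."
    if not deflected:
        return "FAIL: no response to the wash cycle."
    return "WARNING: sensor responded but did not return to its previous value."

# Example with a pH measurement sitting at 8.2 before the wash cycle.
print(wash_cycle_validation(baseline=8.2, during_wash=5.6, after_wash=8.25,
                            min_deflection=0.5, recovery_tolerance=0.2))
```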
[00:08:26.210] - Nick
Often they are installed on an extractive analyser system, in which case you can isolate the measurement sensor and do a calibration or validation there, but others are installed directly into the process. For these, what we have to do is flood an area of the gas measurement with a known standard. And again, we monitor to see that we've caused a deflection: is the deflection what we expected it to be, is the analyser responding correctly? We can validate that way.
[00:08:59.330] - Nick
A very common gas analyser measurement is oxygen in combustion control, where we're probably going to be measuring oxygen at around 4 or 5 percent. So really, you could just use instrument air or ambient air as your check gas, because we know it's got approximately 21 percent O2 contained within. We flood the tip of the sensor with ambient air and we expect to see the analyser respond appropriately, and assuming it does, we've validated our measurement, we don't need to remove the sensor from the hot gas, and we can carry on about our business.
[00:09:37.700] - Sean
Okay, that makes sense. Look, we've previously spoken about liquid analysers, as you were saying a moment ago, but let's say, for example, we have a new pH probe. Does that come with a calibration curve already from the factory?
[00:09:52.970] - Nick
It does come with a calibration curve, or a slope as we might refer to it, when it leaves manufacturing, but you're still required to do a calibration when you pair it with its converter.
[00:10:03.350] - Nick
So the manufacturing guys are going to build the sensor to within an acceptable tolerance. However, as I say, when it arrives on site, the converter doesn't know the in-depth details of that sensor. So, when you pair it with the converter, you still have to go through a calibration procedure. That's when we go into the analyser and do a full two-point calibration, and that way we can lock into the converter the exact output details of that sensor: its zero point and its slope, its speed of response, all those kinds of things.
[00:10:38.040] - Nick
In more recent years, we've seen people starting to take up the digital sensors that have come onto the market, and some of these do come with factory calibrations and don't require that traditional pairing with the converter. All you have to do on occasion is set up the sensor address, and the sensor itself stores all that zero, slope and response information. Really, the converter almost becomes just an HMI, or just a 4-20 mA output from that sensor.
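To illustrate what "just a 4-20 mA output" means in practice, here is a small sketch of the standard linear scaling from a measured value onto a current loop. The pH range used for the example is an arbitrary assumption, not a product configuration:

```python
# Sketch of the standard linear scaling from a measured value to a 4-20 mA signal.
# The measurement range (pH 0-14) is an arbitrary example, not a product setting.

def to_milliamps(value: float, range_low: float, range_high: float) -> float:
    """Map a measurement onto the 4-20 mA loop, clamped to the configured range."""
    fraction = (value - range_low) / (range_high - range_low)
    fraction = min(max(fraction, 0.0), 1.0)  # clamp to 0-100 % of range
    return 4.0 + 16.0 * fraction

print(to_milliamps(7.0, 0.0, 14.0))   # mid-range pH    -> 12.0 mA
print(to_milliamps(0.0, 0.0, 14.0))   # bottom of range -> 4.0 mA
print(to_milliamps(14.0, 0.0, 14.0))  # top of range    -> 20.0 mA
```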
[00:11:12.510] - Sean
Certainly simplifies the process, definitely. Now, this next one might seem like a bit of a silly question, but if you calibrate them, does it mean that if there's drift in the response, you can somehow correct it?
[00:11:24.270] - Nick
Yeah, I mean, it's not a silly question at all. The calibration is the correction, so a calibration doesn't last forever. And as we've spoken about before, process analysers generally have the measuring element in contact with the process, and the process therefore has an influence over the characteristics of that sensor. So, we can do a calibration and that analyser will be correct at that point in time. We put it back into the process, the process starts to have its influence, and the harsher that process is, usually the quicker it's going to cause that sensor to change its characteristics.
[00:12:02.190] - Nick
And really, that's what causes the drift. So periodically, to maintain accuracy, we still have to go through and do the two-point calibration, and we can correct that drift anywhere within the range. If we don't have time to do the two-point calibration, we can also do a single-point calibration. However, it's not quite as good: when you do a single-point calibration, all you're really doing is correcting the zero point. When you expand to do the second calibration point, that's when you account for the slope, or the curve of the output of the sensor, as well.
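To show why that second point matters, here is a generic sketch of a two-point correction compared with a single-point (zero-only) correction. The buffer values and raw readings are hypothetical, and the arithmetic is the general idea rather than any specific converter's algorithm:

```python
# Sketch of the idea behind one-point versus two-point calibration.
# Buffer values and raw readings are hypothetical.

def two_point_correction(raw1, true1, raw2, true2):
    """Fit slope and zero from two buffer readings; return a correction function."""
    slope = (true2 - true1) / (raw2 - raw1)
    offset = true1 - slope * raw1
    return lambda raw: slope * raw + offset

def one_point_correction(raw1, true1):
    """Only shift the zero point; any slope error is left uncorrected."""
    offset = true1 - raw1
    return lambda raw: raw + offset

# Example: a drifted pH sensor reads 7.3 in a pH 7 buffer and 4.6 in a pH 4 buffer.
correct2 = two_point_correction(7.3, 7.0, 4.6, 4.0)
correct1 = one_point_correction(7.3, 7.0)

raw = 5.5                       # a later raw reading from the process
print(round(correct2(raw), 2))  # 5.0 -> zero and slope both corrected
print(round(correct1(raw), 2))  # 5.2 -> zero corrected only; slope error remains
```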
[00:12:37.080] - Sean
So, Nick, now one of the things that we've mentioned a few times during the discussion is standards for calibration. The standards themselves: where do they come from, and how would you know which ones to use?
[00:12:48.840] - Nick
That's actually quite an in-depth subject in itself, so I'll try and keep it as brief here as I can. Again, it will really differ depending on the type of analysers that you're going to use, the reporting requirements that you have, and whether you're going to be performing validations or calibrations. We'll go back to using the humble pH analyser as the example again: for the calibration of these, we use what's called a buffer.
[00:13:20.200] - Nick
And if we consider a gas analyser, then we use a standard, or a standard gas. So, you know, some people might want to know what the difference is between a standard and a buffer. Well, a buffer is a chemical solution, and it's been designed to have the capacity to absorb a small amount of change without affecting its actual value. A standard, on the other hand, or a standard gas for a gas analyser, is a solution with an exact predetermined value, and it can't withstand any contamination or dilution without changing its value.
[00:13:54.670] - Nick
So really, that's a brief description of the difference between the two.
[00:13:59.680] - Sean
There's a lot to look into there with the difference between standards and buffers. But look, unfortunately, we've actually run out of time today, and we've managed to smash quite a few misconceptions around calibration and validation. It's also clear to me there's so much more to this topic that we haven't yet touched on. So, Nick, we'd love to invite you back for a continuation of this discussion.
[00:14:21.520] - Nick
Yeah, I think just that brief discussion here has uncovered a few more things we need to talk about. So, yeah, I'd be glad to come and join you again.
[00:14:29.560] - Sean
And thanks to you, our listeners, for joining us once again. If you've got any questions or particular topics you'd like us to discuss, please contact us at debunks@au.yokogawa.com. Also, if you enjoyed today's episode, please remember to like it and share it on your social media channels. We look forward to welcoming you back for future discussions, but in the meantime, stay safe and remember: Yokogawa Debunks.