Charm RTD Offsets

Guys

The RTD CHARM control modules at a customer site are configured with -200 to 850 DegC for the XD scale and 0 to 150 DegC for the PV scale. They are configured as Direct Independent and use FIELD_VAL (not FIELD_VAL_PCT). This seems to work nicely, with the following exception.

All the AI modules read between 0.4 and 1.0 degrees high across the range.

Has anybody come across this before?

Steve

  • First question is how are you determining the offset? Are you sure that the other measurement is accurate?
  • In reply to Bruce Brandt:

    Hi Bruce

    In parallel with asking this question I have asked the site team to recheck the accuracy of the RTD simulator.

    Steve

  • In reply to Steve Linehan:

    Steve,

You do recognize that 1 degree represents less than 0.1% of span, which is probably the limit of accuracy on the module.

    Bruce
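
    Bruce's span arithmetic can be checked directly. The XD scale from the original post is -200 to 850 DegC, a span of 1050 degrees, so even the worst reported error (1.0 DegC) is just under 0.1% of span. A minimal sketch of the arithmetic (Python, values taken from the thread):

    ```python
    # XD scale from the original post: -200 to 850 DegC
    xd_lo, xd_hi = -200.0, 850.0
    span = xd_hi - xd_lo  # 1050 degrees

    # Reported errors of 0.4 to 1.0 DegC as a percentage of span
    for err in (0.4, 1.0):
        pct = 100.0 * err / span
        print(f"{err} DegC error = {pct:.3f}% of span")

    # Even the worst case stays below 0.1% of span
    assert 100.0 * 1.0 / span < 0.1
    ```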

  • In reply to Steve Linehan:

Are you using RTDs over such a range (-200 to 850 DegC)?

    Anyway, we had the same issue on our system, using thermocouples. The calibrations were done against 3 different probes, so we were able to define an offset for different ranges. Let's say an offset X between -200 and 0, an offset Y between 0 and 500, and an offset Z between 500 and 800.

    What you then need to do is manipulate the incoming raw data directly; that way you don't have to change anything on faceplates, etc.

    It would be a solution, but we never implemented it.

    The issue with such a software "calibration" is:

    1. repeatability (over time you have to be sure that the offset is still right)

    2. qualification (if you are working under GMP it is doable, but it introduces more work)

    We ran these test calibrations over 6 months to establish a reliable rule; during that time, 6 calibrations were done. The offsets were clearly definable in 4 cases, but in 2 cases we would have had worse values with the software fine-tuning than without it.

    The final decision was not to implement such a solution, as it also introduces a safety/quality risk in the system if the offsets are set wrongly or simply forgotten.

    The solution in our case was to change the User Requirements Specifications.

    In some cases the requirements cannot be achieved because of the design, equipment limitations (your case), etc.

    I hope this experience helps.
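
    The range-based correction described above (offsets X, Y, Z for different temperature bands, applied to the incoming raw data) could be sketched as follows. The boundaries and offset values here are placeholders standing in for X/Y/Z; real values would come from the calibration records, and as noted, the scheme was never actually implemented:

    ```python
    def corrected_temp(raw_degc, offsets=None):
        """Apply a range-dependent offset to a raw temperature reading.

        `offsets` is a list of (lower bound, upper bound, offset) tuples,
        one per calibration range. The defaults are placeholder values
        for the X/Y/Z offsets mentioned in the post.
        """
        if offsets is None:
            offsets = [(-200.0, 0.0, -0.5),   # "X"
                       (0.0, 500.0, -0.7),    # "Y"
                       (500.0, 800.0, -0.9)]  # "Z"
        for lo, hi, off in offsets:
            if lo <= raw_degc < hi:
                return raw_degc + off
        return raw_degc  # outside all calibrated ranges: pass through

    print(corrected_temp(100.0))   # falls in the 0-500 band
    ```

    The pass-through branch is exactly the risk the post raises: a reading outside the calibrated bands silently gets no correction, and a wrongly entered or forgotten offset corrupts every reading in its band.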

  • In reply to Adam Bagosi:

    So I'm all wet with my previous comment. Looking at Books Online, the module should have a reference accuracy at 25 deg C of 0.25 deg C, and you're seeing more than that. Again, I'd be concerned that the device you're using to test may not be calibrated to that level of accuracy unless your client is using NIST-calibrated equipment.

  • In reply to Bruce Brandt:

    Additionally, I forgot to mention that you can also look for a better result on the hardware side. We exchanged our thermocouples to see if we could obtain better results. Then we changed to a better type of thermocouple. Then we changed the URS.

    In your case I would cross-check the requirements against the sensor specification. The sensor vendor's datasheet states how accurate the sensor can be.

  • Bruce

    The calibration team just reported back that with a different simulator the accuracy was fine.

    Guess some contractor calibration certs cannot be taken at face value.

    Steve

  • In reply to Steve Linehan:

    Not hugely surprising, though I would certainly be making a point to the contractor that his equipment is out of spec.