Filter and Time Constant

Call me dense, but I'm having trouble understanding how the time constant on a filter block works.

I am trying to implement a rate-of-change alarm to warn operators when a certain reaction might be starting to get out of control, before it is too late.  At first the signal was way too noisy and would have generated a lot of false alarms, so I added a filter block to smooth it out and have been watching the trend.  How do I determine the correct time constant?  I am trying to identify (alarm on) a temperature increase of 1 degC/min or greater.  The STDEV for the AI$OUT appears to be around 0.11, while the range is 0 to 100 degC.  I want to smooth the signal out as much as possible without risking missing a real high rate.

Thanks

  • The filter time constant is the time it takes the filtered output to change by ~63% of a single step change in the input.  Alarms on rate-of-change are notoriously tricky, as I think you are discovering.  You don't say how much temperature change is significant or how quickly you need the alarm to occur.  My guess (from the 1 degC/min figure) is that the time frame for generating an alarm is a minute or two and that only increases are significant.  Temperature signals usually aren't too noisy, but the rate-of-change calculation greatly amplifies whatever noise there is.  Is the raw temperature signal noisy?  If so, could the problem be electrical interference of some kind?  Assuming the temperature signal isn't too noisy, this is what I would do:

    First, apply a mild filter (say 5 seconds) to the raw temperature measurement.  Then calculate the instantaneous rate-of-change: the best way is to put the filtered temperature through a dead-time block and then subtract the delayed value from the filtered value; you can adjust the period of the delay by changing the dead-time parameter.  Use a CALC block to convert that difference into an instantaneous rate-of-change in degC/min - you will need the dead-time parameter in this calculation.  Don't hard-code anything into the calculation: if you adjust the dead time, the calculation has to stay correct.  I'd start with the dead time equal to the module's execution time, as long as that isn't shorter than one second.  Finally, run the calculated rate-of-change through a filter block.

    You now have three 'tuning' parameters: the raw-temperature filter time, the dead time and the rate-of-change filter time.  You'll need to look at the trends, but I'd guess a rate-of-change filter time of 20-30 seconds might do the trick.  It's a trade-off between delaying the alarm and generating false alarms.  As human response to an alarm is rarely better than one minute, and can be a lot longer at busy times, I'd err on the side of over-filtering - too many false alarms and the operator will just ignore all alarms.

    I had a very similar problem many years ago on a polyethylene plant and got that to work just fine, but it did take some careful juggling of the three parameters.
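A quick numeric sketch may help with the ~63% figure mentioned above: it is just the step response of a first-order lag. This is plain Python rather than DCS function blocks, and the scan period and time constant are illustrative values I picked, but it shows the filtered output covering about 63.2% of a unit input step after exactly one time constant:

```python
def filter_step(x, y, dt, tau):
    """One scan of a first-order (exponential) filter: the output y
    moves toward the input x by a fraction dt / (dt + tau) per scan."""
    return y + dt / (dt + tau) * (x - y)

tau = 10.0   # filter time constant, seconds (illustrative value)
dt = 0.01    # scan period, seconds
y = 0.0      # filtered output; input held at 0 until the step

# Apply a unit step at t = 0 and run for exactly one time constant
for _ in range(round(tau / dt)):
    y = filter_step(1.0, y, dt, tau)

print(round(y, 3))   # ~0.632, i.e. ~63% of the step
```

The practical upshot: a bigger time constant smooths more, but also means the filtered value takes roughly one time constant to show ~63% of any real change, which is the delay you trade against false alarms.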
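For what it's worth, the whole chain described above (mild filter on the raw temperature, dead-time delay, difference scaled to degC/min, heavier filter on the rate) could be sketched in plain Python like this. It is an illustration, not DCS code: the class name, parameter names and the alarm limit are my assumptions, with the default values taken from the suggestions in the answer.

```python
from collections import deque

class RateOfChangeAlarm:
    """Sketch of the block chain: first-order filter on the raw
    temperature, dead-time delay, rate-of-change in degC/min, then a
    heavier first-order filter on the rate. Names are illustrative."""

    def __init__(self, dt, temp_filter_tau=5.0, dead_time=1.0,
                 rate_filter_tau=25.0, alarm_limit=1.0):
        self.dt = dt                     # module execution period, s
        self.temp_tau = temp_filter_tau  # raw-temperature filter, s
        self.dead_time = dead_time       # delay for the difference, s
        self.rate_tau = rate_filter_tau  # rate-of-change filter, s
        self.alarm_limit = alarm_limit   # degC/min
        n = max(1, round(dead_time / dt))
        self.delay = deque([None] * n, maxlen=n)  # dead-time block
        self.filt_temp = None
        self.filt_rate = 0.0

    def update(self, raw_temp):
        """One scan: returns (filtered rate in degC/min, alarm flag)."""
        # Mild first-order filter on the raw measurement
        if self.filt_temp is None:
            self.filt_temp = raw_temp
        a = self.dt / (self.dt + self.temp_tau)
        self.filt_temp += a * (raw_temp - self.filt_temp)

        # Dead-time delay, then instantaneous rate in degC/min.
        # The dead-time parameter appears in the scaling, so changing
        # it keeps the calculation correct (nothing hard-coded).
        delayed = self.delay[0]
        self.delay.append(self.filt_temp)
        if delayed is None:
            return self.filt_rate, False   # delay line still filling
        rate = (self.filt_temp - delayed) / self.dead_time * 60.0

        # Heavier filter on the (noisy) instantaneous rate
        b = self.dt / (self.dt + self.rate_tau)
        self.filt_rate += b * (rate - self.filt_rate)
        return self.filt_rate, self.filt_rate >= self.alarm_limit
```

The three 'tuning' parameters map directly onto the constructor arguments, so the trade-off between alarm delay and false alarms can be explored by feeding trend data through this with different values before committing anything to the real configuration.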