The data historian on our system appears to have data compression facilities that have been haphazardly implemented. I am used to seeing the data collection for each tag defined with items such as update time, deadbanding of the signal, smoothing of the signal, and so on. What I want, for a small number of tags that I am investigating, is the highest possible definition for the duration of the investigation, i.e. the minimum possible update time, with the data point read and stored at every update, and with no deadbanding, smoothing, or interpolating of the data. So far I have not been able to find out how to do this.
The symptom I have is that when I look at the tag data on an HMI screen, the data has high definition, but when it is historised, straight lines extending over long periods between data points are evident.
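For what it is worth, here is a rough Python sketch of the sort of absolute deadband (exception deviation) filtering I suspect the historian is applying. The function name and deadband value are just illustrative, not anything from our actual system:

# Rough sketch of an absolute deadband (exception deviation) filter.
# A new sample is only stored if it differs from the last STORED value
# by more than the deadband; everything in between is discarded, and the
# trend tool then draws a straight line between the surviving points.

def deadband_filter(samples, deadband):
    """Return the subset of (time, value) samples a deadband would keep."""
    stored = [samples[0]]                  # always keep the first sample
    for t, v in samples[1:]:
        if abs(v - stored[-1][1]) > deadband:
            stored.append((t, v))          # change exceeded deadband: store it
    return stored

# Example: a slow ramp of 0.01 units/second against a 0.5-unit deadband.
raw = [(t, 0.01 * t) for t in range(0, 601)]   # 10 minutes at 1 s updates
kept = deadband_filter(raw, deadband=0.5)
print(f"{len(raw)} raw samples reduced to {len(kept)} stored points")
# -> 601 raw samples reduced to 12 stored points

If something like this is running with a wide deadband on a slowly moving signal, it would produce exactly the long straight segments I am seeing in the historised trends.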
Any suggestions?
Regards
Jeff Richards