How does History Collection work together with Data Compression? For example, what value is a new sample compared against to decide whether to collect it or not, and is it the sampling time that determines how often that comparison is made?
Here's an old paper on data compression. The first-order predictor (FOP) looks most like what I remember from old PI (OSIsoft) training, and might be close to what DeltaV uses.
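To illustrate the general idea of a first-order predictor (this is a hedged sketch of the generic FOP technique, not DeltaV's or PI's actual implementation; the function name and parameters are made up for illustration): each new sample is compared against a straight-line extrapolation through the last stored points, and is stored only if it deviates from that prediction by more than the deviation setting.

```python
def fop_compress(samples, deviation):
    """Illustrative first-order-predictor compression.

    samples: list of (time, value) tuples, in time order.
    deviation: allowed error band around the linear prediction.

    A point is stored when it deviates from the straight-line
    extrapolation of the previously stored trend by more than
    `deviation`; points on (or near) the line are discarded.
    """
    stored = [samples[0]]   # always keep the first snapshot
    slope = 0.0             # initial prediction: flat line
    for t, v in samples[1:]:
        t0, v0 = stored[-1]
        predicted = v0 + slope * (t - t0)
        if abs(v - predicted) > deviation:
            # prediction failed: store the point and re-base the slope
            slope = (v - v0) / (t - t0)
            stored.append((t, v))
    return stored

# A steady ramp compresses down to its breakpoints:
samples = [(0, 0.0), (1, 1.0), (2, 2.0), (3, 3.0), (4, 10.0)]
print(fop_compress(samples, 0.5))  # → [(0, 0.0), (1, 1.0), (4, 10.0)]
```

Note how a constant ramp is reduced to a couple of points, since every intermediate sample falls on the predicted line; this is why predictive compression can outperform a simple deadband on slowly trending signals.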
Your sampling rate sets how often the historian grabs a snapshot; compression determines whether that snapshot is stored or discarded. The idea is that you can sample frequently but consume less hard-drive storage by only storing points that exceed your compression (deviation) setting.
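The sample-then-filter behavior described above can be sketched as a simple deviation (deadband) filter. This is only an illustration of the concept, assuming the simplest scheme where each snapshot is compared against the last *stored* value; the actual comparison DeltaV uses may be more sophisticated (e.g. predictive, as discussed elsewhere in this thread).

```python
def deadband_compress(snapshots, deviation):
    """Illustrative deviation-based compression.

    snapshots: values sampled at the configured sampling rate.
    deviation: the compression (deviation) setting.

    A snapshot is stored only if it differs from the last STORED
    value by more than `deviation`; otherwise it is tossed out.
    """
    stored = [snapshots[0]]  # the first snapshot is always stored
    for v in snapshots[1:]:
        if abs(v - stored[-1]) > deviation:
            stored.append(v)
    return stored

# Seven snapshots, but only three exceed the 1.0 deviation band:
snapshots = [10.0, 10.2, 10.4, 11.5, 11.6, 13.0, 12.9]
print(deadband_compress(snapshots, 1.0))  # → [10.0, 11.5, 13.0]
```

So the sampling rate decides how often the comparison runs, and the deviation setting decides which of those samples survive to disk.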
In my opinion, compression is less important since hard drives are so cheap and large. YMMV.
There is also some information on compression in Books Online (BOL).
In reply to John Rezabek:
Andre Dicaire