
Synchronizing the Batch Historian and Batch Executive Servers

I'm just having an issue with my Batch Historian server.

I'm missing two months of batch history data and campaign history data.

I was able to recover the batch journal .evt files related to the batch history by following a KBA NK-1300-0254 "Synchronizing the Batch Historian and Batch Executive Servers".

I have recovered the CMJournal but I'm not able to update the campaign history data on the batch historian server.

Is there any KBA to follow or some other indication to update the historian data?

Thanks in advance.

6 Replies

  • Hello,

    I am in the same boat. My Batch Historian is missing one month of data. Also, the entire folder at the path specified for collecting the data is missing.

    Has anyone seen this issue?
  • In reply to Summer:

    You don't specify which version of DeltaV you are running, but there was an issue in v14.3.1 that could result in what you are describing, with the Batch Historian not collecting data. The fix (Workstation Hotfix 07) was released last week. It has special installation-order instructions: you need to install it on the Batch Historian station first, before installing it on any of the historian's data sources. KBA NK-1900-0840 has the link to the hotfix for v14.3.1.
  • In reply to Scott Thompson:

    Thank you for that information, but we are on DeltaV v14.3. I actually found that the services on the Batch Historian were stopped. After starting them manually I was able to see the data again. What about auto delete on the Batch Historian? Will enabling auto delete remove the data from SQL, or only from Batch History View?
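Since stopped services turned out to be the root cause here, it can be worth scripting a periodic check. A minimal sketch in Python, using the Windows `sc query` command; the service names below are placeholders, not the actual DeltaV service names, so substitute the ones from your own Services console:

```python
import subprocess

# Placeholder names -- check services.msc on your historian station for the
# real DeltaV Batch Historian service names before using this.
HISTORIAN_SERVICES = ["DeltaVBatchHistorian", "DeltaVCampaignHistorian"]

def parse_sc_state(sc_output: str) -> str:
    """Extract the STATE value (e.g. 'RUNNING', 'STOPPED') from `sc query` output."""
    for line in sc_output.splitlines():
        line = line.strip()
        if line.startswith("STATE"):
            # The line looks like: "STATE : 4  RUNNING"
            return line.split()[-1]
    return "UNKNOWN"

def check_services(names):
    """Return a mapping of service name -> state via Windows `sc query`."""
    states = {}
    for name in names:
        result = subprocess.run(["sc", "query", name],
                                capture_output=True, text=True)
        states[name] = parse_sc_state(result.stdout)
    return states
```

Run `check_services(HISTORIAN_SERVICES)` from a scheduled task and alert on anything that is not `RUNNING`, so a stopped collector doesn't silently cost you a month of history again.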
  • In reply to Summer:

    Auto delete will completely delete the batch data from SQL. If you want to be able to continue to access the data, you should have auto archive enabled instead. This copies the data over to an archive database, which is then detached when it reaches the target size. If you need to see the data again, the archive can be re-attached and reports can be pointed at the archive database as their source.
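The auto-archive cycle described above (copy records into an archive database, detach it once it hits the target size, start a fresh one) can be sketched as a toy model; the record shapes and the size measure here are illustrative, not the actual DeltaV schema:

```python
from dataclasses import dataclass, field

@dataclass
class ArchiveSet:
    """Toy model of auto archive: an active archive db plus detached ones."""
    target_size: int                              # records per archive before detach
    active: list = field(default_factory=list)    # the currently attached archive
    detached: list = field(default_factory=list)  # closed-out archives, re-attachable

    def archive(self, record):
        self.active.append(record)
        if len(self.active) >= self.target_size:
            # Detach: close out this archive and start a fresh one.
            self.detached.append(self.active)
            self.active = []
```

For example, archiving seven batches with a target size of three leaves two detached archives and one batch in the active database; the detached ones hold their data intact and only need re-attaching to be queried again.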
  • In reply to Summer:

    I don't recommend enabling auto delete. I have not found auto archive to be helpful in older versions, as we would typically want data archived by year and this was not an option. We typically end up making this a manual maintenance activity to ensure data is not lost and is groomed so clients can access it reasonably. What I would really like is for the batch historian to act like OSIsoft PI archives, where the data is not sloughed from file to file but rather the pointer to the active db is shifted. I'd like it to shift when I schedule it (quarterly, yearly, etc.), provided I have sized the active db based on consumption rates. I'd also like a SOA or other client-facing service that allows queries against all archives (focused by time, of course), so clients don't have to figure out where the data is.
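The client-facing routing wished for above, where a query is focused by time and the service figures out which archive holds the data, is straightforward to sketch. This assumes a hypothetical catalog mapping each archive database name to the date span it covers; the names and spans are made up for illustration:

```python
from datetime import date

def archives_for_range(catalog, start, end):
    """Return the archive names whose date span overlaps [start, end].

    catalog: {archive_name: (first_day, last_day)} -- hypothetical metadata
    that a client-facing service would maintain as archives are detached.
    """
    return [name for name, (first, last) in catalog.items()
            if first <= end and last >= start]

# Illustrative catalog of yearly archives (not real DeltaV database names).
catalog = {
    "BatchArchive_2022": (date(2022, 1, 1), date(2022, 12, 31)),
    "BatchArchive_2023": (date(2023, 1, 1), date(2023, 12, 31)),
}
```

A query window straddling the year boundary, say November 2022 through February 2023, would resolve to both archives, so the client never has to know where the data physically lives.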