
Improving Distillation Column Performance

Wally Baker, Emerson

One of the things we like to show users is how smart our smart transmitters can be. Sure, they provide diagnostic and status information, but users don’t always realize how much they can discern about their process by examining this information, such as predicting problems before they occur.

Wally Baker makes an excellent case in point in a short article in the Daily News from the AFPM Operations and Process Technology Summit 2017 in October. His article, Improving Distillation Column Performance, can be found on page 8 and discusses one of the trickiest things to control in a distillation column: flooding.

The primary problem in a distillation or stripping process is column flooding, where the amount of liquid is too great, impairing free flow in both directions. When a solid layer of liquid forms in the bottom or on a tray, vapor must bubble through it. Some bubbling is normal, but if the liquid is deep enough, separation largely stops. Such a formation can potentially lead to the column filling with frothy liquid, which can be sent out the top without proper separation.

This is a problem for most automation systems since they usually don’t have enough information to optimize the process:

The automation system controlling the distillation column must modulate process parameters including feedrate, column pressure, reboiler heat duty, reflux ratio, various temperatures, etc. In many cases, however, not all desired measurements are available to optimize these controls, although it is possible to add instrumentation to identify key challenges.

Wally makes the point that using a 3051S to measure differential pressure across one or multiple trays can identify when flooding has happened, but if you dig a little deeper into the data the transmitter is collecting, it can tell you when flooding is beginning to form:

The pressure transmitter measures the differential pressure very rapidly, often as fast as 20 times per second, but the analog 4-20 mA signal is usually sampled by the control system at much slower rates, often once per second or slower. It becomes more challenging to infer an actual process change from what may simply be high-frequency noise, so developing problems can be easily overlooked.
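
To see why this matters, here’s a minimal simulation sketch in Python. It is purely illustrative: the 5-unit mean DP, the noise figures, and the time scale are all made up, and this is not how the transmitter works internally. The average DP never moves, and the handful of once-per-second readings a control system would log look like ordinary jitter, even though the underlying high-frequency variability grows every minute:

```python
# Illustrative simulation only (made-up numbers, not 3051S internals):
# a tray DP signal with a steady average but slowly growing high-frequency
# noise. The once-per-second values a control system would log look like
# ordinary jitter, so the developing disturbance is easy to miss in a trend.
import numpy as np

rng = np.random.default_rng(42)

FAST_HZ = 20                    # transmitter measurement rate cited in the article
SAMPLES_PER_MIN = 60 * FAST_HZ  # one minute of full-rate data

for minute in range(1, 11):
    true_sigma = 0.02 + 0.03 * minute               # hidden disturbance grows each minute
    dp = 5.0 + rng.normal(0.0, true_sigma, SAMPLES_PER_MIN)
    dcs_readings = np.round(dp[::FAST_HZ][:5], 2)   # a few once-per-second samples
    print(f"min {minute:2d}: true noise sigma={true_sigma:.2f}  "
          f"1/s readings={dcs_readings}")
```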


OK, so how does all that extra data tell us anything? Wally explains:

Some pressure transmitters can calculate this statistical process information and report changes to the automation system digitally, spotting a meaningful change against the background clutter. When enough data has been collected from these specific DP instruments over periods of effective operation, a change in the standard deviation of the pressure can be reported and tracked as an initial indicator of incipient flooding. When identified early, corrective measures can be taken before more serious problems develop.

So when the standard deviation of the measurement begins to increase, even though the measured value itself has not changed, a flooding incident is imminent. If you don’t take any action now, in another 10 to 30 minutes you’ll probably find yourself in a full-blown flooding incident as the trays fill and chemical separation begins to fall off.
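
A hedged sketch of that early-warning logic is below, written as if we had to process the DP samples ourselves; in practice, as the excerpt above notes, transmitters like the 3051S can compute these statistics internally and report them to the automation system digitally. The window length, alarm ratio, and mean band are made-up placeholders, not recommended settings:

```python
# Sketch of the early-warning idea described above (not Emerson's actual
# algorithm): capture a baseline standard deviation during known-good
# operation, then flag incipient flooding when the rolling standard deviation
# rises well above that baseline while the mean DP stays in its normal band.
from collections import deque
from statistics import mean, stdev

class FloodingWatch:
    def __init__(self, window=120, std_ratio_alarm=2.0, mean_band=0.5):
        self.window = deque(maxlen=window)   # most recent DP samples
        self.std_ratio_alarm = std_ratio_alarm
        self.mean_band = mean_band           # allowed drift of the mean DP
        self.baseline_mean = None
        self.baseline_std = None

    def set_baseline(self, samples):
        """Call with DP data captured during effective, non-flooded operation."""
        self.baseline_mean = mean(samples)
        self.baseline_std = stdev(samples)

    def update(self, dp):
        """Feed one new DP sample; returns True if incipient flooding is suspected."""
        self.window.append(dp)
        if self.baseline_std is None or len(self.window) < self.window.maxlen:
            return False
        current_mean = mean(self.window)
        current_std = stdev(self.window)
        mean_still_normal = abs(current_mean - self.baseline_mean) < self.mean_band
        std_elevated = current_std > self.std_ratio_alarm * self.baseline_std
        # The warning condition: variability is up while the average is unchanged.
        return mean_still_normal and std_elevated

# Hypothetical use:
#   watch = FloodingWatch()
#   watch.set_baseline(dp_history_from_good_operation)
#   if watch.update(latest_dp):
#       print("Rising DP variability with a steady mean -- possible incipient flooding")
```

The baseline comparison is the important design choice here: what counts as “too noisy” depends on the column and the service, so an absolute threshold would not travel well from one installation to another, while a ratio against the standard deviation seen during effective operation does.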

All this benefit comes from what we normally consider to be a “secondary” variable. When you have the right kind of smart instrumentation, such as a 3051S, its native intelligence can go a long way toward better control.

You can find more information like this, and meet other people looking at the same kinds of situations, in the Emerson Exchange 365 community. It’s a place where you can communicate and exchange information with experts and peers in all sorts of industries around the world. Look for the Pressure Group and other specialty areas for suggestions and answers.

1 Reply

  • Doing analytics in the cloud would be even slower than doing them in the DCS, although some analytics do lend themselves well to the cloud or to a DCS compute node or embedded controller. This application is yet another good example of how, for very fast analytics, the work has to be done in the transmitter, as close to the sensor as possible. In other words, there are layers of analytics: real-time analytics in some cases has to be done in the transmitter and in some cases can be done in an on-premises computer, while historical “Big Data” analytics can be done in the cloud. You have to pick the best place to do your analytics based on what you are analyzing. Analytics also percolate up through these layers: you may do bearing vibration analytics in the transmitter close to the sensor, but at a higher layer you do analytics for the pump as a whole, and so on. Learn more about what other plants are doing from this essay: www.linkedin.com/.../