Improving combustion efficiency by incorporating CO monitoring

18 May 2001
Significant improvements in accuracy of control and efficiency in a wide range of combustion situations can be achieved by monitoring carbon monoxide levels. The potential cost savings from combined O2 and CO monitoring are very significant, particularly in processes with varying load.
The widespread use of coal-fired boiler systems remains prevalent, particularly in Eastern Europe and other developing countries. The need to upgrade and refurbish existing power plants, not only to improve efficiency but also to reduce environmental pollution has never been greater. The measurement of carbon monoxide in the flue gas and its subsequent use in the on-line control of a boiler system, whilst already accepted in the West, is now being embraced elsewhere. CO monitoring of flue gases is not in itself a radically new idea, but the application of the available technology in improving efficiency is constantly evolving.
The enlargement of key economic communities such as the European Union is often conditional on potential entrants meeting stringent criteria. A safe, efficient and clean power generating system is one way in which to compete on equal terms, and provide a basis for entry. The introduction of large scale manufacturing plants into developing territories coupled with a rapid expansion in power generating capacity also drives the need for improvements in efficiency. This is the key to being competitive, as energy production represents such a large direct cost.
For many years the excess oxygen at combustion process outlets has been measured and used to control the combustion efficiency. This approach has been shown to be effective and has been widely adopted. It is based on the simple principle that if the outlet oxygen levels are low, then the fuel is not being fully combusted and is thus wasted. Conversely, if there is excess oxygen leaving the process, then exhaust heat losses are greater, leading to reduced efficiency. But the shortcomings of this approach for coal-fired power generation were established over 20 years ago.
Monitoring of carbon monoxide as an indicator of combustion efficiency was first considered by Anson et al.[1] in 1971 and by Ormerod and Read[2] in 1979. The technique allowed much more accurate setting of optimal combustion and improved control in situations where load conditions vary. The potential for savings using combined oxygen and carbon monoxide measurement has been shown to be very significant. This is particularly relevant where measured oxygen levels may be affected by stratification effects or air ingress. Studies have indicated that reduction of excess O2 by as little as 0.5 per cent can yield savings, in fuel costs alone, of hundreds of thousands of dollars a year. In addition to economies in fuel consumption, CO control can realise equally valuable increases in power output both in response to climatic conditions and with reduced mill availability in coal-fired boilers.
A number of techniques have been developed for measuring carbon monoxide in combustion processes. These include in-situ infra-red absorption, extractive infra-red absorption and electrochemical detection. All of these have been made commercially available, with some more successful than others. The key requirements for effective control of the combustion process are fast response, measurement accuracy and long-term stability.
More recently, control of combustion using CO monitoring has proved valuable in helping to reduce NOx emissions. Using CO and O2 monitoring together allows the user to maximise efficiency whilst reducing excess air, thereby minimising NOx emissions.
Since the amount of excess air determines so many aspects of the combustion process, it is usual to measure the oxygen concentration in the flue gases and adjust this to the appropriate level. For example, a boiler operating at full load will require a lower excess air level than one at minimum load, whilst a very low excess air level will minimise NOx emissions. Oxygen analysers employing zirconia sensors are accurate, easy to use and relatively cheap. Nevertheless, the oxygen concentration can give an ambiguous measure of combustion efficiency. As well as the variation with boiler load, the optimum oxygen concentration will depend on the amount of air inleakage after the boiler and, for coal-fired applications, the amount of moisture in the fuel.
Figure 1 shows how stack losses vary with excess air and boiler load conditions. For any given load it can be seen that the combustion losses rise sharply when excess air falls below a certain level. This increase is due to the energy wasted by incomplete combustion of the fuel.
Losses rise steadily when excess air increases past the stoichiometric point, owing to waste heat carried out by the air.
The correlation between excess air and O2 concentration measured in the flue gas is indicated in Figure 2. Not surprisingly, these are linearly related. The stack losses are also shown, for low and high load conditions. From this it can be seen that even if the combustion can be exactly controlled at the stoichiometric point in all cases, the variation in O2 setpoint between high and low load (control range) is significant.
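The near-linear relation between excess air and measured O2 shown in Figure 2 follows from a standard combustion rule of thumb, sketched below. The formula is a widely used dry-flue-gas approximation, not one stated in the article; 20.9 is the oxygen content of dry air in per cent.

```python
# Approximate excess air from the measured O2 concentration (dry basis).
# This is the standard estimate behind the linear relation in Figure 2.

def excess_air_percent(o2_percent: float) -> float:
    """Estimate excess air (%) from flue-gas O2 (%, dry basis)."""
    if not 0 <= o2_percent < 20.9:
        raise ValueError("O2 reading must lie between 0 and 20.9 %")
    return 100.0 * o2_percent / (20.9 - o2_percent)

# Example: a 3 % O2 reading corresponds to roughly 17 % excess air.
print(round(excess_air_percent(3.0)))  # → 17
```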
The relationship between CO and excess air is shown in Figure 3, again with stack losses superimposed. As the excess air increases past the level required for stoichiometric combustion, so the CO level becomes almost constant, at a low level. In contrast to O2 measurement, however, the point where the CO concentration begins to rise rapidly correlates strongly with the condition for minimum losses across the full range of boiler loads. The measured CO concentrations are primarily affected by the incomplete combustion of fuel, rising very rapidly when there is insufficient oxygen, and hence insufficient excess air, for complete combustion.
The effect of this relationship is that, even allowing for a range of excess air levels around the stoichiometric combustion point, there is little variation in CO setpoint and this is practically independent of load conditions. By adjusting the combustion air level so that the CO concentration lies just above the minimum level (typically 100 to 200 ppm) the boiler efficiency can be optimised. Even where considerations other than maximum efficiency are paramount, knowledge of the CO concentration can be an important indicator of the combustion conditions in the boiler.
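The control strategy described above can be illustrated as a simple trim loop that nudges the combustion-air demand so the measured CO stays just above its minimum. This is a sketch under stated assumptions, not the article's implementation: the 100 to 200 ppm band comes from the text, but the step size and the air-demand representation are hypothetical.

```python
# Illustrative CO trim loop: hold the measured CO inside a 100-200 ppm band
# by nudging the combustion-air demand. Gain (step) and the 0-1 air-demand
# scale are hypothetical; real boilers use a tuned DCS loop.

def co_trim(co_ppm: float, air_demand: float,
            lo: float = 100.0, hi: float = 200.0,
            step: float = 0.01) -> float:
    """Return an adjusted air demand (fraction of full range, 0-1)."""
    if co_ppm > hi:        # CO rising fast: combustion starving, add air
        air_demand += step
    elif co_ppm < lo:      # CO at its floor: excess air too high, trim back
        air_demand -= step
    # Inside the band: hold the current demand.
    return min(1.0, max(0.0, air_demand))

demand = 0.50
for reading in (350, 260, 180, 90, 150):   # simulated CO readings, ppm
    demand = co_trim(reading, demand)
print(round(demand, 2))  # → 0.51
```

The deadband is the key design point: because the optimum CO setpoint varies so little with load, the same band can be used across the operating range, which is exactly the advantage the text claims over O2-only control.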
Similar effects to those described for varying load can be observed when comparing O2 and CO setpoints for different fuel type and composition.
The optimum CO concentration is also virtually unaffected by air inleakage after the combustion zone, even if this is substantial, whereas oxygen concentration is significantly changed.
The process of deciding whether CO monitoring will prove of commercial benefit must include careful consideration of the combustion process. In general, the greatest savings can be made when the fuel is coal, oil or biofuel (wood products); gas-fired applications rarely justify this approach. The more the load changes, the greater the advantage that can be gained from using CO to control the combustion. Similarly, when fuel composition may be variable, using CO monitoring offers greater economic benefit. One pitfall of optimised control is the potential for build-up of slag on the boiler tubes, which can have a detrimental effect on performance that can easily outweigh the benefits of fuel economy and reduced heat losses.
Controlled combustion operating at theoretical maximum efficiency minimises the excess air. This can lead to an increase in unburned carbon in the fly ash without increased CO levels, which in turn can cause excessive slagging. Therefore caution needs to be exercised in balancing the theoretically derived optimal parameters and the practical conditions.
In selecting a CO monitoring system, due consideration must be given to the response time required. This will be dictated by a combination of the distributed control system (DCS) and the methods used to adjust fuel and combustion air. Generally, the faster the response, the better; however, a pragmatic view may be adopted, taking all factors into consideration. It could be argued that CO monitor response time should match oxygen monitor response as closely as possible.
The response of a CO monitor is dictated by a number of parameters: process lag; sample transport; and measuring device characteristics. The process lag is the time taken for the combustion exhaust gas to get from the combustion chamber exit to the sampling point. The sample transport delay is the time taken for the gas to get from the sampling point to the measuring device. The measuring device response is most commonly defined as the time taken for the output to show 90 per cent of the actual value, when presented with a step change in CO concentration.
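The three components named above simply add, which gives a quick way to compare candidate systems. The figures below are hypothetical examples, not values from the article.

```python
# Rough overall response estimate for an extractive CO monitor, built from
# the three components named in the text. All timing figures are
# hypothetical examples for illustration.

def total_response_s(process_lag_s: float,
                     sample_transport_s: float,
                     t90_s: float) -> float:
    """Sum the delays seen by the control system after a step change in CO."""
    return process_lag_s + sample_transport_s + t90_s

# e.g. 5 s for the gas to reach the sampling point, 20 s down the sample
# line, and a 15 s T90 for the analyser itself:
print(total_response_s(5, 20, 15))  # → 40
```

A cross-stack instrument eliminates the sample-transport term entirely, which is one reason it is favoured for combustion control later in the article.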
An additional benefit of using carbon monoxide measurement to minimise excess air is the reduction of SO2 levels. This in turn leads to a reduction in SO3, producing a marked decrease in acid corrosion.
In more recent times NOx emissions have increased in importance when considering combustion control. The introduction of low NOx burners, and in some countries the imposition of environmental "tariffs" have made this a consideration of commercial significance. In situations where load or fuel vary (as mentioned above), control of combustion using measurement of oxygen alone is likely to lead to situations where excess air is well above the level required for stoichiometric combustion. In such circumstances nitrogen and oxygen will combine to form NOx, leading to greater emission than would be the case using both CO and O2 measurement for control.
In addition to the above factors, when implementing any CO monitoring it is necessary to take into account how the process conditions will affect the measuring system. In this respect matters such as flue gas temperature, ambient conditions, dust burden, flue gas composition and ease of service must all be considered.
A number of technologies are available for measuring CO concentrations in stack gases. Each technique consists of two basic elements: the method for delivering the flue gas sample to the analyser and the physical principle used to make the measurement. In-situ probe and cross-stack instruments rely on the measuring device being sited where the gas can be analysed as it passes through the process (Figure 4). Extractive systems take a continuous sample stream of the flue gas and transport it to the measuring device.
Infrared instruments use the natural absorption characteristics of carbon monoxide molecules to determine the concentration. The amount of infra-red radiation, at specific spectral wavelengths, absorbed by the flue gas is continuously measured. This allows the concentration of CO to be determined.
The infrared wavelength is carefully chosen to ensure that other constituents in the gas have an insignificant effect on the measurement.
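The physical principle behind such instruments is the Beer-Lambert absorption law, sketched below. This shows the general relation only, not any particular analyser's algorithm; the absorption coefficient and path length used are assumed values for illustration.

```python
import math

# Sketch of the Beer-Lambert relation underlying infrared CO measurement:
# transmitted intensity I = I0 * exp(-k * c * L), so concentration c follows
# from the measured absorbance. The coefficient k and path length L below
# are hypothetical illustration values, not real analyser constants.

def co_concentration(i0: float, i: float, k: float, path_m: float) -> float:
    """Recover concentration (ppm) from incident/transmitted intensity."""
    absorbance = math.log(i0 / i)       # natural-log absorbance
    return absorbance / (k * path_m)    # invert Beer-Lambert

# With an assumed absorption coefficient k = 1e-4 per (ppm*m) and a 2 m
# cross-stack path, a 3 % drop in intensity reads as roughly 152 ppm CO:
print(round(co_concentration(1.00, 0.97, 1e-4, 2.0)))  # → 152
```

The exponential form also explains the wavelength choice discussed above: working at a line where H2O and CO2 absorb weakly keeps the measured absorbance attributable to CO alone.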
Electrochemical technology uses the voltage generated by the reaction of carbon monoxide with a balanced chemical cell to indicate concentration. The reaction is reversible by periodically exposing the cell to air, allowing continuous, long-term use. The chemical cell is designed to be inert to other flue gas constituents.
The technologies and their relative merits are summarised in Table 1.
Each of the available technologies is best suited to a specific range of applications. Cross-stack infrared (Figures 5 and 6) is undoubtedly best for combustion control where a fast response and low maintenance are important. It also avoids problems with sample integrity, which can arise whenever an extractive sampling system is used. A relatively inexpensive combustion control system can be arranged by combining in-situ CO and O2 analysers, thereby avoiding the capital and servicing costs of a sample handling system. The principal drawback of a cross-stack measurement is the difficulty in performing a span calibration, since it is not usually possible to fill the stack with test gas. This can be overcome to some extent by fitting a gas cell inside the analyser and filling this with a known concentration of gas.

Extractive analysers become more attractive where CO measurement has to be combined with a range of environmental parameters such as NOx or SO2, since the expense of a sample extraction and conditioning system can be spread across a range of analysers, and regular calibration checks can easily be carried out. Electrochemical sensors provide a rugged, reliable and cost-effective solution, though many users favour non-dispersive infrared (NDIR) analysers, especially where high concentrations of acid gases are present, since these can affect the lifetime of electrochemical sensors. NDIR analysis suffers from cross-sensitivity to H2O and CO2, which has to be corrected for. The in-situ probe infrared system falls somewhere between the cross-stack and the extractive systems and offers some advantages where a simple system with fast response is required.
Whilst the measurement of excess oxygen is the most common method for combustion control, this has serious shortcomings where load and process conditions are changing. The addition of CO monitoring offers significant advantages in such conditions.
When choosing to monitor and control using CO, it is important to take account of other operational factors. Selection of technology also needs to be carefully considered.
Combined monitoring of O2 and CO has been shown to offer optimal combustion control. This can lead to efficiency improvements that can save thousands of dollars a month, giving very fast payback on investment in instrumentation. There are also other benefits from optimising combustion this way, in particular related to improved environmental emissions performance.
Table 1. Technologies for the measurement of carbon monoxide in industrial processes