Corrosion of waterwalls in fossil-fueled boilers and gasifiers has traditionally been considered the result of gaseous corrodents, such as H2S and HCl, reacting with the heat exchanger tube surfaces. Under reducing conditions these corrodents prevent the formation of a protective oxide scale, leading to increased metal loss. Recent field experience in boilers using staged combustion systems has shown much higher corrosion rates than predicted by simple gas/solid corrosion processes. The presence of large quantities of unoxidized iron sulfide in deposits in areas where high corrosion rates were found suggests that deposits play a role as well. Subsequent laboratory corrosion studies found that the presence of FeS can indeed lead to very high corrosion rates, but only under oxidizing conditions. Since FeS usually deposits only where reducing conditions are present, the accelerated corrosion observed requires alternating reducing and oxidizing conditions. Such conditions may exist in areas of staged boilers where the overfire air mixes with substoichiometric fluegas from the burner zone. It will further be shown that chlorine corrosion may be caused, or at least accelerated, by chloride-containing deposits in fossil-fueled boilers, instead of or in addition to corrosion due to HCl in the fluegas. Because of the high sulfur content of fossil fuels, chloride deposits can form only under reducing conditions. Once formed, however, they are highly corrosive under both mildly reducing and oxidizing conditions. It is therefore likely that the basic chlorine corrosion mechanisms in fossil-fueled boilers are the same as those in waste incinerators.
Fireside corrosion of carbon or low-alloy steel waterwall tubes in fossil-fueled, mainly coal-fired, boilers has been a minor but persistent problem. Recently the problem has become more severe in the United States due to the introduction of staged combustion systems designed to lower NOx emissions. In these systems the fuel is initially combusted at a stoichiometric ratio of 0.7-0.9. Additional air is added through overfire air ports above the burner zone, so that complete combustion is achieved at the furnace exit at the usual stoichiometric ratio of 1.2. Under these conditions, corrosion rates in excess of 60 mils/yr (1.5 mm/yr) are frequently experienced over relatively large areas, especially in supercritical boilers. In subcritical boilers corrosion rates up to 60 mils/yr are also occasionally reported, although rates there are generally less than 40 mils/yr (1 mm/yr).
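As a rough illustration of the air split implied by these stoichiometric ratios (a back-of-the-envelope estimate, not a figure taken from the studies discussed here): if the burner zone operates at a stoichiometric ratio of 0.8 and the furnace exit at 1.2, the overfire air ports must supply approximately

    (1.2 - 0.8) / 1.2 ≈ 0.33

of the total combustion air. In other words, roughly one third of the air enters above the burner zone, so the lower furnace waterwalls are exposed to substoichiometric fluegas over a correspondingly large region.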
A survey of waterwall corrosion studies indicates that the observed losses are usually attributed to the presence of gaseous sulfur and/or chlorine species. According to most researchers [1-4], the corrosion mechanism involved is simple sulfidation of the steel in the presence of reduced sulfur species, mainly H2S, in the fluegas. Thus, under reducing conditions iron sulfide or mixed iron sulfide/iron oxide scales are formed. These are less adherent and more prone to spalling than iron oxide scales, resulting in high corrosion rates. This theory holds up quite well under extremely reducing conditions, for instance in gasifiers. There, corrosion rates measured in isothermal laboratory studies in H2S- and CO-rich environments correspond quite well with those found in actual service [5]. The corrosion of low-alloy or carbon steel was found to be a function of the H2S partial pressure of the syngas. If one extrapolates these corrosion rates to the H2S levels usually found in boilers, 500-1500 ppm, the predicted corrosion rates are relatively low, generally less than 20 mils/yr (0.5 mm/yr). This finding has recently been confirmed by laboratory experiments carried out by Kung [4]. Based on his