TWI Industrial Member Report Summary 757/2002
P Woollin
Background
Segregation of molybdenum, in particular, reduces the pitting corrosion resistance of highly alloyed stainless steel weld metals in comparison with the parent steels. Consequently, if preferential attack of the weld metal is to be avoided, a welding consumable is required with higher levels of the elements contributing to corrosion resistance than the parent material. This typically leads to the selection of nickel alloys with molybdenum levels in the range 9-15% for welding superaustenitic and, sometimes, superduplex stainless steels. These steels typically contain nitrogen at levels of about 0.2 weight %.

The most common nickel alloy with such a molybdenum addition, ie alloy 625 with 9% molybdenum, also contains 3.5% niobium. Niobium has recently been considered detrimental because of its tendency to combine with nitrogen from the adjacent stainless steel, reducing corrosion resistance locally in the heat affected zone (HAZ) and forming niobium nitrides in the weld metal. Consequently, proprietary welding consumables based on alloy 625, but without niobium, have been developed. However, instances of lower than expected pitting corrosion resistance have been encountered industrially when using such filler metals.
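As a rough illustration of why the filler is overalloyed, the pitting resistance equivalent number (PREN), commonly taken as %Cr + 3.3 x %Mo + 16 x %N (other nitrogen weightings are also in use), can be compared for nominal compositions. The sketch below uses assumed, nominal values for a 6%Mo superaustenitic parent steel and alloy 625 weld metal, including an assumed Mo-depleted dendrite core; none of these figures are data from this report.

    # Illustrative PREN comparison. PREN = %Cr + 3.3*%Mo + 16*%N is a
    # common convention; some authors weight nitrogen at 30. All
    # compositions below are assumed nominal values for illustration
    # only, not measurements from this report.

    def pren(cr: float, mo: float, n: float = 0.0) -> float:
        """Return PREN = %Cr + 3.3*%Mo + 16*%N (weight %)."""
        return cr + 3.3 * mo + 16.0 * n

    alloys = {
        "6Mo superaustenitic parent (nominal)":  pren(20.0, 6.1, 0.20),
        "Alloy 625 weld metal, bulk (nominal)":  pren(21.5, 9.0),
        "Alloy 625 dendrite core (assumed Mo)":  pren(21.5, 7.0),
    }

    for name, value in alloys.items():
        print(f"{name}: PREN = {value:.1f}")

On these assumed figures, even the Mo-depleted dendrite core of the overalloyed weld metal retains a PREN above that of the parent steel, which is the rationale for selecting fillers with 9-15% molybdenum.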
Objectives
- To quantify the effect of niobium alloying level on the corrosion resistance in ferric chloride solution of nickel alloy weld metals made with fillers nominally equivalent in composition to AWS ERNiCrMo-3, ie with 9%Mo.
- To quantify the effect of nitrogen-bearing TIG shielding gas on the corrosion resistance in ferric chloride solution of nickel alloy weld metals made with AWS ERNiCrMo-3 wire and with a Nb-free equivalent (see the sketch below).
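A minimal sketch of how such ferric chloride results might be reduced to a critical pitting temperature (CPT) is given below. It assumes a stepped-temperature exposure in the style of ASTM G48; the summary does not state the exact procedure used, and the outcomes shown are invented placeholders, not data from this report.

    # Minimal sketch: reduce a stepped-temperature ferric chloride test
    # series to a critical pitting temperature (CPT). Procedure assumed
    # to follow the general pattern of ASTM G48; all results below are
    # hypothetical placeholders.

    def critical_pitting_temperature(results: dict[float, bool]) -> float | None:
        """Given {test temperature degC: pitting observed?}, return the
        lowest temperature at which pitting occurred, or None if none."""
        pitted = [t for t, attacked in sorted(results.items()) if attacked]
        return pitted[0] if pitted else None

    # Hypothetical outcomes for two weld metals (placeholders only):
    nb_bearing = {40.0: False, 45.0: False, 50.0: True, 55.0: True}
    nb_free    = {40.0: False, 45.0: True, 50.0: True, 55.0: True}

    print("CPT, Nb-bearing weld metal:", critical_pitting_temperature(nb_bearing))
    print("CPT, Nb-free weld metal:   ", critical_pitting_temperature(nb_free))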