
Solder is a marvelous material for joining metal parts at relatively low temperatures. According to technology historians, the first use of solder to join metals (mostly for adornments and some simple tools) came on the heels of the discovery of tin in Britain some 4,000 years ago. The utility of the base metal, and later of simple alloys, earned solder an ever-expanding role in civilization and industry, in applications as disparate as joining water piping, sealing automotive radiators, and making stained-glass windows. Its crowning role for most of the last century, however, has been joining the electrical and electronic elements of electronic products, from simple spliced wires to the most advanced chips and chip packages of the present day.
For most of its history in electronics, the solder alloy of choice was tin-lead, either the Sn60/Pb40 alloy or the eutectic Sn63/Pb37 version. These two alloys were the workhorses of the industry, well understood in terms of both processing and reliability. That changed with the advent of lead-free: a well-meaning but ill-conceived and poorly executed conversion, forced on the industry by the European Union in 2006.
While the purveyors of prospective lead-free solder solutions asserted that they had everything under control, nothing could have been further from reality. After the rollout of the first high-temperature SAC alloys, the industry quickly discovered how vulnerable its production lines and products really were. More than $100 billion has since been spent trying to find an equivalent to the tried, tested, and trusted tin-lead alloys.
One of the most frustrating things about the forced conversion was that the stated risks to human health were massively overstated.
This is an excerpt; the full article appeared in the February 2020 issue of SMT007 Magazine.