We have three amazing kids. The youngest, Chloe, has been borrowing her big brothers’ tablets to watch movies. To avoid further conflicts, Santa decided to bring Chloe a tablet. Needless to say, she was ecstatic (and so were her brothers). Chloe is three, so this was a huge step in her independence as the little woman she is (Figure 1).
Chloe is a very observant girl. She has seen her brothers talk to Alexa, ask questions, and issue commands. So, one of the first things Chloe asked her own Alexa was: “Alexa, play something I like.” As a 3-year-old, Chloe doesn’t know anything about the thousands of hours it took Amazon’s team to build Alexa’s impressive artificial intelligence (AI) engine. She also doesn’t know anything about machine learning, algorithms, or neural networks.
What surprised me about this interaction was Chloe’s ability to transcend the machine-human interface to a point where she knew Alexa would understand her enough to play something she liked. Of course, in this case, Alexa was not able to respond and asked Chloe to be more specific with her request. Think about it: a generation that’s perfectly comfortable with machines knowing what they like and dislike.
It was 1913 when Henry Ford installed the first assembly line to manufacture his Model T. The assembly line was able to reduce the assembly time of the Model T from 12 hours to 2.5 hours (Figure 2). This was one of the very early successes in automation. Instead of manually moving parts and subassemblies of his cars from workstation to workstation, a conveyorized system would move the parts through the several stages of assembly. Instead of people, parts would move from point to point automatically.
Figure 2: Ford’s assembly line (1924).
Fast forward 107 years, and automation is ubiquitous. There are just a few industries left untouched by automation. And these industries, as a rule, use the lack of automation as a differentiator (think hand-made watches, shoes, and guitars). For all other applications, automation is a given. For this reason, when we talk about the factory of the future and discuss the next transformative steps in SMT manufacturing, it is a waste of time talking about automation. We need to see automation as a fait accompli. The fact that a machine is automated and can perform pre-programmed instructions is a given. That has been done, so what’s next? It may be the end of the automation era.
When we discuss Industry 4.0 and all the amazing things it will be able to do for us, what we really need to talk about is autonomous systems. Autonomous systems go far beyond automated systems. Let me give you an example based on something we do every day: autonomous X-ray machines. To start, an automated X-ray system will take a sample, inspect it, and, based on a set of programmed parameters, assess whether the sample passes or fails. In some cases, the system will require human intervention to help determine if the sample is indeed a pass or fail.
To this point, I have not described anything new to you. This is the level of expectation from users today. The next level of expectation, however, relates to making the X-ray machine autonomous. That means the user will no longer need to program the machine. The X-ray system will be able to determine if the board is good or bad based on its own evaluation of the board. It goes without saying that AI will play a major role in the execution of this vision.
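The difference between the two approaches can be sketched in a few lines of code. This is a hypothetical illustration, not how any particular X-ray system is implemented: the function names, measurements, and thresholds are all invented. The automated version checks a board against limits a human programmed; the autonomous version judges a board only by its similarity to boards the machine has already learned are good.

```python
# Hypothetical sketch: automated vs. autonomous pass/fail decisions.
# All names, measurements, and thresholds are invented for illustration.

def automated_inspect(void_pct, bridge_count):
    """Automated: pass/fail against parameters a human programmed."""
    MAX_VOID_PCT = 25.0   # operator-chosen solder void limit, in percent
    MAX_BRIDGES = 0       # operator-chosen solder bridge limit
    return void_pct <= MAX_VOID_PCT and bridge_count <= MAX_BRIDGES

def autonomous_inspect(void_pct, bridge_count, known_good):
    """Autonomous sketch: judge a board by its distance to boards the
    machine has already learned are good; no programmed limits."""
    def dist(sample):
        return ((sample[0] - void_pct) ** 2 +
                (sample[1] - bridge_count) ** 2) ** 0.5
    nearest = min(dist(s) for s in known_good)
    return nearest < 5.0  # learned tolerance, shown here as a constant

# Measurements (void %, bridge count) from boards already judged good:
good_history = [(8.0, 0), (12.5, 0), (10.1, 0)]

print(automated_inspect(11.0, 0))                  # True
print(autonomous_inspect(11.0, 0, good_history))   # True
```

The point of the sketch is the second function's signature: nobody typed in a void limit. As the machine accumulates experience, `known_good` grows and the decision boundary shifts on its own.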
AI will also be a necessary component when machine learning allows us to connect the collective experiences of each machine in the SMT line to create what we call “line conscience.” Line conscience develops as interconnected machines, from the solder paste printer to the X-ray system, speak a single language to share experiences and ideas (Figure 3). Ultimately, line conscience will deliver unprecedented levels of efficiency and efficacy. Case in point: in this factory of the future, we’ll be able to produce boards with a lot size of one at perfect quality, but that’s the topic of a future column.
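What might that "single language" look like in practice? Here is a deliberately minimal sketch, with every name invented for illustration: each machine on the line publishes its findings in one shared message format, so a downstream machine can weigh what happened upstream when it makes its own decision.

```python
# Hypothetical sketch of a "single language" for the SMT line.
# Machine names, fields, and severities are invented for illustration.

from dataclasses import dataclass

@dataclass
class LineMessage:
    machine: str     # e.g., "paste_printer", "pick_and_place", "xray"
    board_id: str    # which board the finding is about
    finding: str     # human-readable description of the observation
    severity: float  # 0.0 (informational) to 1.0 (clear defect)

class LineConscience:
    """Shared memory of the line: every machine reads and writes here."""
    def __init__(self):
        self.log = []

    def share(self, msg):
        self.log.append(msg)

    def history_for(self, board_id):
        return [m for m in self.log if m.board_id == board_id]

line = LineConscience()
line.share(LineMessage("paste_printer", "B-001",
                       "low paste volume on pad U3", 0.4))
line.share(LineMessage("xray", "B-001",
                       "18% void under U3", 0.6))

# The X-ray machine can correlate its finding with the printer's
# earlier observation about the very same pad:
print([m.finding for m in line.history_for("B-001")])
```

A real deployment would push these messages over the network rather than an in-process list, but the essential idea is the same: one schema every machine can emit and parse, so experiences accumulate across the whole line instead of dying inside each station.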
Figure 3: SMT line of the future as it develops “line conscience.”
Chloe was disappointed when Alexa was not able to play something she liked. As engineers of future technologies, it is our job to continue to push toward a future of autonomous systems, including machines that know what we want and don’t need to be programmed and assembly lines that learn from each other. Together, we can develop a line conscience that can be duplicated in digital twins.
We need to strive for a level of abstraction that will allow us the flexibility to customize complex assemblies in a simple way. We need to continue working so that one day, a 3-year-old girl can ask Alexa to play something she likes, and Alexa will be able to respond. By the way, Alexa, the answer was “Wonder Pets.”
Dr. Bill Cardoso is CEO of Creative Electron.