How Do I Get Smart With IPC CFX? (Part 1)
Situational Analysis: 30 Years of Bad Experience
One of the original criticisms of automation, since its introduction, has been the inevitable loss of visibility and control over the operation. In SMT assembly, we see placement machines working faster than the eye can follow, placing materials that the human eye cannot see unaided, leaving human production operators and management completely dependent on their equipment and the data it produces. This is most unlike a human-driven production process, where the operational flow can be continuously refined with Kaizen activities.
By contrast, high-speed machines seem pretty much fixed in the way that they operate, following instructions that they are given. However, this assumption is not true. There are many areas in which the operation of automation is compromised in terms of performance, driven by inefficient programs, poor planning, inadequate material logistics, and incompatibilities in the product design. In the past, Kaizen tools have been somewhat overlooked when dealing with automated assembly processes due to the lack of data and visibility of the machine operation. Though machines in the SMT operation do have interfaces to the outside world, these were created more with the intention of machine diagnostics rather than providing customer-driven MES value, leading to a situation where each machine has its own proprietary interface, and more significantly, proprietary data content.
For a manufacturing organization to make use of machine data, sophisticated software—and in many cases, hardware—is required to extract data and make it meaningful in context with other data gathered from machines of different vendors. Once developed, machine interfaces are liable to change as machine software is updated and enhanced by the vendor (often without consideration for who is using the data), resulting in the sudden unexpected failures of data collection, which can often be hidden for a while until incorrect or inconsistent data is noticed. Therefore, very significant effort has to be put into the support of machine communication to ensure that it works effectively and continuously. With such elevated costs and an otherwise high risk of failure, innovation in the use of data from automated processes has been severely restricted.
Considering data derived from manual processes, the issue becomes more severe, as the latitude to record data in different ways varies from person to person, even where standard forms are used. It is not surprising to hear cases reported where 80% of issues in manufacturing come from 20% of manual operations. Thus, data from across the shop floor is usable only with human levels of interpretation, which is certainly not good enough for automated use by ERP or PLM systems, for example.
Challenge: The Next Generation Is Coming
Today's assembly factories are facing the biggest challenge the industry has seen in a generation, called by many the next industrial revolution. In essence, however, the challenge is a simple extrapolation of trends that have been occurring and intensifying in assembly manufacturing for decades. As product lifecycles shorten, competition grows, technology change accelerates, and online visibility for customers increases, the potential cost and risk of holding stock of finished goods become an increasing burden on the business.
Depreciation in the value of material stock adds to the traditional costs of warehousing and logistics, especially where products are assembled remotely from their target market. The effect on the assembly factory has been to drive the need for flexibility as an in-line response to delivery demand, rather than working on a batch basis and relying on the finished-goods warehouse to create the impression of flexibility while retaining batch production.
Therefore, losses in assembly manufacturing due to the lack of visibility and management of automated assembly processes are multiplied several-fold. Productivity levels reduce significantly as product mix increases and batch sizes shrink, even to the level of single-piece production, which is no longer experienced only in exceptional cases. The resultant serious threat to business success is driving the search for a new solution that reaches another level of understanding and optimization in the operation of automated processes.
As a result, Industry 4.0 and the drive towards smart factories emerged to foster the creation of tools that assist and augment key management decisions and engineering projects and, to an increasing extent, perform decision-making automatically. For this to happen, however, a related step change is needed in the quality, availability, and value-generation ability of data obtained from automated machine processes.
Though electronics assembly has been a focal point for machine communication in the assembly industry as a whole, due to the nature of the technology and the sheer complexity of the bill of materials, the requirements for digitalization in other areas of assembly manufacturing are also developing. The need has arisen for consistency and completeness throughout the whole of assembly manufacturing, driven by the increasing number of electromechanical devices and products. Automation in assembly outside of electronics could be thought of as simpler, lacking the historical issues around legacy bespoke interfaces, but it is an area that is effectively starting from scratch in terms of advanced digital communication and needs to be included.
The Human Element
Factories are not fully automated through the use of assembly machines and robots and are not likely to be in the foreseeable future, as processes requiring humans will continue to be needed. Therefore, the inclusion of human operations is essential as part of the digitalized factory. For example, the digitalized factory could also use augmented reality to enable humans to be a key part of the operation, performing a far wider range of tasks than before with higher value, driven by a continuous feed of expert knowledge enabling participation in many key roles and enhancing their job satisfaction.
Values and Challenges of Digitalization Today
The concept of digitalization and Industry 4.0 offers manufacturers a compelling business solution where flexible factories are managed and optimized, achieving levels of productivity once enjoyed with higher volumes and simple product mix. However, the barrier of visibility and acquisition of data is a significant issue, holding back the majority of factories in the market from making the decision to go forward.
Machine vendors experience a great deal of frustration, as customers approach them to provide an “MES interface” tailored for their needs. Many customers have similar requests but with differently defined requirements; thus, the development and support overhead borne by machine vendors has escalated rapidly in recent years. In many cases, requests relate to defined metrics that a particular operation is using, because it is easier that way or because the specific data points needed to calculate the metrics are not available through the legacy machine interface. The permutations seem infinite. For IT teams within manufacturing—as well as software solution providers that understand the power that direct machine connectivity can bring—there are so many different machine vendor solutions that need to be developed and supported in each factory, no matter what level of co-operation the machine vendors provide, that interfaces again become a very significant overhead.
Enter the IPC Connected Factory Exchange (CFX)
IPC is responsible for creating consensus-based standards for the PCB assembly industry with many leading companies across the industry working together to find common, optimum specifications. The CFX committee within IPC was formed to create a solution for the challenges faced by the digitalization of factories with the creation of a true industrial internet of things (IIoT) standard covering all aspects of digital factory data content and data communication. The intent was to create a standard that is truly “plug and play” with the inclusion of all hardware and software vendors, using a single communication standard without dependencies, licensing conditions, or fees. Nor would the solution be controlled by any business entity, but rather by IPC.
To be a “plug-and-play” IIoT standard, three main criteria need to be defined. A simple analogy is the use of a mobile phone. The hardware works according to a standard, enabling handsets from many different vendors to connect seamlessly. The signal that is used to connect the handsets is a part of a defined standard network, shared by all carriers, supported on all handsets. This means that any phone can connect via a call to any other phone across the entire planet, connecting everyone together. The conversion of the spoken voice into the digital signal that is transmitted across the network is the encoding method.
Every handset needs to know the standard way to encode, and at the other end of the conversation, decode, the digitized content back into an analog voice. As in this mobile phone example, almost all standards regard themselves as complete at this point. The remaining issue is that although anyone, such as someone in the USA, can call any person in China, if the person in China does not speak English and the American does not speak Chinese, then they cannot communicate effectively at all. The definition of content, essentially the language that is used, is a critical part of the definition of a true communication standard, and is the crucial element that differentiates CFX as a true IIoT standard from legacy data transfer mechanisms, and even from new so-called IIoT solutions that simply act as a conduit for poorly defined data from one point to another.
Being a native IIoT solution brings control over the data content and availability into the hands of the machine vendor, who can provide assurance of performance and compliance. Add-on IIoT solutions that simply move data into the Cloud, for instance, are based on interpretation of data from sources that are liable to change at any time, and carry all the same risks as legacy systems under the guise of new technology.
CFX Transport and Encoding
Figure 1: AMQP v1.0 data transfer options.
In the case of CFX, the transport mechanism is AMQP v1.0 (Figure 1). This is a secure data transportation standard created and widely used in the finance industry for monetary transactions. Therefore, it is a very secure, globally proven and accepted, and reliable mechanism. Data messages can be encrypted where CFX is used to transfer sensitive data outside of the company domain.
AMQP v1.0 allows for two methods of communication, both of which are utilized by CFX. The first is a simple mechanism for anyone on the CFX network, referred to as an “endpoint,” to send a message to a central point, known as a “host” or “broker.” The host will then provide all of the work to distribute messages to any subscribers of the message. Thus, the endpoint creating the data does not need to care about which messages are being used or how many other endpoints need to receive the messages; further, it does not need to deal with the various confirmations required. To handle all of this would unnecessarily tax the computing power within the machine for this type of broadcast message, which could affect machine operation. Many free-of-charge hosts for AMQP v1.0 can be found on the internet that work with CFX.
In addition to the broadcast message type, there is also a requirement in manufacturing for direct point-to-point communication of very large or critical data. For example, this can be the transfer of a large 3D inspection image to another endpoint that may process the image to determine defect trends or for traceability purposes. Other applications may include closed-loop machine-to-machine analytics that drive continuous improvement of the performance of the line. AMQP v1.0 was selected for CFX as a modern, secure transportation mechanism that can support both of these modes of communication. The host may be Cloud-based, site-based, or a combination of each.
In addition to the publishing of data, each endpoint on the CFX network—such as an assembly machine—also has the ability to receive data from other machines and endpoints performing transactional events in the factory, such as material logistics. CFX is completely omnidirectional. In this way, every machine is given the opportunity to extract data from anywhere in the manufacturing environment, irrespective of the vendor, with which to improve and optimize their operation.
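The publish/subscribe pattern described above can be sketched in a few lines. The sketch below is a minimal in-process illustration of the host/broker role, not the CFX SDK or a real AMQP v1.0 client: endpoints publish a message once, and the broker does all the work of fanning it out to subscribers, so the publishing machine never tracks who consumes its data. The topic name and message content are hypothetical.

```python
from collections import defaultdict

class Broker:
    """Minimal sketch of an AMQP-style host/broker (illustrative only):
    the broker, not the publishing endpoint, distributes each message
    to every subscriber of its topic."""
    def __init__(self):
        # topic name -> list of subscriber callbacks
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        # The endpoint that created the data does not need to know
        # how many consumers exist; the broker handles distribution.
        for callback in self.subscribers[topic]:
            callback(message)

broker = Broker()
received = []
# Two independent consumers (e.g., an MES and an analytics tool)
broker.subscribe("CFX.Production", received.append)
broker.subscribe("CFX.Production", lambda m: received.append(("analytics", m)))
# One machine endpoint publishes a single message
broker.publish("CFX.Production", {"unit": "PCB-001"})
```

In a real deployment the broker would be an external AMQP v1.0 host (Cloud-based or site-based), and delivery confirmations would be handled by the messaging layer rather than by the machine.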
The JavaScript Object Notation (JSON) lightweight data-interchange format was selected as the mechanism with which to encode data into CFX messages. JSON is a modern data exchange standard widely used by many software systems, and a successor to the older XML data exchange format.
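As a rough illustration of what JSON encoding means in practice, the sketch below builds a CFX-style message envelope and round-trips it through Python's standard `json` module. The field names and state values are modelled on published CFX examples but should be treated as assumptions to be checked against the IPC-CFX specification; the endpoint name is hypothetical.

```python
import json
import uuid
from datetime import datetime, timezone

# Illustrative CFX-style message envelope (field names are assumptions,
# not authoritative; consult the IPC-CFX specification for the real schema).
message = {
    "MessageName": "CFX.ResourcePerformance.StationStateChanged",
    "Version": "1.0",
    "TimeStamp": datetime.now(timezone.utc).isoformat(),
    "UniqueID": str(uuid.uuid4()),
    "Source": "SMT-Line1.Placer2",  # hypothetical endpoint handle
    "MessageBody": {
        "OldState": "ReadyProcessingActive",
        "NewState": "UnplannedDowntime",
    },
}

encoded = json.dumps(message)   # the text payload carried over AMQP v1.0
decoded = json.loads(encoded)   # any consumer, in any language, can decode it
```

Because JSON is self-describing text, a consumer written by a different vendor in a different language can decode the same payload without a shared binary library, which is the point of standardizing the encoding.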
CFX Messages and Content
The design intent of CFX is to allow it to be used for many disparate types of automated factory machines and processes, manual processes, and transactional events. Virtually all data standard attempts in the past took an approach to define content that appears logical at first glance. One considers the varied types of machines that exist and then maps out data content definitions by the type of machine, which seems reasonable. However, it resulted in undesirable outcomes in reality due to the nature of modern machines and systems in a factory.
There is entirely too much functional redundancy across machines; therefore, those standards became filled with redundant content. For example, many machines have internal transfer conveyors and work zones; stand-alone conveyors have these too, and many have barcode scanners for ID acquisition. Thus, these old approaches required re-implementation of that identical functionality or capability in myriad machines. Worse, the implementations often had different flavors for the same capability, leading to extreme difficulty in data interpretation by any data consumer that connected to those varied machines.
Finally, there was a failing beyond which the old approach could never advance: custom machines. Any approach based on machine type would require going to committee to define each custom-created machine workcell. Such workcells are quite common, and this relegated those systems back to custom integration. Fortunately, there is an entirely different and superior way to approach data content that suffers from none of these problems.
CFX took a novel and revolutionary approach to data content by looking at machines not as monolithic entities but as collections of base capabilities that can be common to many machines. Each capability is defined in the content standard only once and then reused with perfect consistency. From this came the concept of information building blocks, or data topics, based on the lowest-level capabilities of a system or machine, each representing an individual function or characteristic of manufacturing machines and processes. Through the combination of messages from different topics, the complete operational functionality of any machine or process, current or future, can be modelled, including entirely custom machines of which only single examples exist.
Figure 2: Topics as message building blocks.
In Figure 2, each topic is shown as a different-colored building block. On the left of the diagram, the relevant blocks that match events occurring within the machine process are assembled together to describe the operation of each unique machine. Applications on the right side of the diagram can then decide which messages to subscribe to based on their scope.
In operation, each defined CFX endpoint will create a series of messages belonging to the defined topics that describe their operation. Figure 3 shows an example of the message transfer for an illustrative endpoint.
Figure 3: Examples of CFX message transfer.
Four of the main “Level 1” topics are color-coded in Figure 3. Beneath each Level 1 topic is a series of Level 2 topics, which provide finer detail about events. Level 3 topics also exist beneath Level 2, which focus on specific technical areas. The main topics are as follows:
- Root: Messages used to establish communication and discovery of endpoints together with their capabilities, described in terms of supported topics
- Production: Messages used to convey an action or value directly related to, or that has affected, a unit of production; this includes product flow control, recipe management, machine setup, poka-yoke control (a Japanese term describing the operational handshake used to avoid the possibility of incorrect operation occurring), closed-loop data analysis, assembly actions (materials and tools used), as well as support for test data and result acquisition
- Resource Performance: Details and timings of performance events, operational state changes, and related data; note that CFX provides data with which performance metrics can be calculated and does not transfer pre-defined metrics or reports
- Materials: Information about materials as they are received at the factory, including transportation, verification, carriers, etc., as well as material transport orders
- Information: Work order definition and scheduling, quality management, data, and validation of production units
- Sensor: Identification of production units and acquisition of measurements, including energy consumption, temperature, and humidity
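The building-block idea behind these topics can be sketched with reusable capability classes: each capability is defined once and combined to model different machines, including one-off custom workcells. The class names, events, and message shapes below are hypothetical illustrations of the concept, not definitions from the CFX standard.

```python
# Each "topic block" is defined once and reused with identical semantics
# by every machine that has that capability (names are hypothetical).

class ConveyorTopic:
    def units_arrived(self, unit_id):
        return {"Topic": "Production", "Event": "UnitsArrived", "Unit": unit_id}

class BarcodeScannerTopic:
    def unit_identified(self, barcode):
        return {"Topic": "Sensor", "Event": "UnitIdentified", "Barcode": barcode}

class FaultTopic:
    def fault_occurred(self, code):
        return {"Topic": "ResourcePerformance", "Event": "FaultOccurred", "Code": code}

# A placement machine and a stand-alone conveyor reuse the same blocks,
# rather than each defining its own flavor of "conveyor" messages.
class PlacementMachine(ConveyorTopic, BarcodeScannerTopic, FaultTopic):
    pass

class StandAloneConveyor(ConveyorTopic, BarcodeScannerTopic):
    pass

placer = PlacementMachine()
msg = placer.unit_identified("PCB-001")
```

Because the `UnitIdentified` message is identical whichever machine emits it, a data consumer interprets it once and understands it everywhere, which is exactly the redundancy problem that machine-type-based standards could not solve.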
This article was originally presented at the Technical Proceedings of SMTA International 2018. In Part 2, Ford will address CFX adoption, the values and challenges of digitalization with CFX, and the scope of CFX utilization.
Michael Ford is the senior director of emerging industry strategy at Aegis Software.