Data Sharing in the Age of Machine Learning Demands a Digital Thread

by Michael Schuldenfrei | April 22, 2018

Electronics, and the components that power them, are more complex and advanced than ever. As these technologies become part of our everyday lives, their quality grows ever more mission-critical. With demand for components rising this fast, there is no room for quality to lag behind the pressure to deliver volume; letting it slip is a sure-fire way to guarantee a recall at the end of the supply chain.

According to NHTSA data, the automobile industry experienced this very issue from 2007 to 2016, when car recalls attributable to electronics roughly tripled. To ensure that quality keeps pace with the complexity and sheer number of today’s electronics, a new, holistic approach must be taken across the supply chain: building a digital thread with machine learning and IoT analytics.

The digital thread is a technique for tracking the genealogy and data of a product, from each individual component through to the end product. Because it is such a critical element of any manufacturing supply chain, it is only a matter of time before it becomes standard operating procedure.

The fundamental element of a digital thread is shared data: data shared inside a company, as well as with every company along the supply chain. For electronics manufacturers, that could mean data from each component’s fabrication and test phases, through assembly, inspection and rework, and finally usage data from the field.
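To make the idea concrete, here is a minimal sketch of how such a genealogy might be represented. The record types, field names and phase labels are illustrative assumptions, not a standard digital-thread schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical, simplified record types for illustration only; real digital-thread
# schemas (identifiers, phases, parameters) vary by manufacturer.

@dataclass
class PhaseRecord:
    phase: str                       # e.g. "wafer_test", "assembly", "field_usage"
    measurements: Dict[str, float]   # parametric test results or usage telemetry
    passed: bool

@dataclass
class ComponentRecord:
    component_id: str                # e.g. a chip's electronic chip ID (ECID)
    phases: List[PhaseRecord] = field(default_factory=list)

@dataclass
class BoardRecord:
    board_serial: str
    components: List[ComponentRecord] = field(default_factory=list)
    phases: List[PhaseRecord] = field(default_factory=list)

def trace_genealogy(board: BoardRecord) -> Dict[str, List[str]]:
    """Walk the thread from the end product back to every component and phase."""
    return {
        comp.component_id: [p.phase for p in comp.phases]
        for comp in board.components
    }
```

With records like these linked across parties, a failing board in the field can be traced back to the exact components it contains and the data each one produced along the way.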

In today’s competitive environment, most companies won’t even entertain the idea of opening the door to their proprietary data. They cite many concerns: exposing IP to potential competitors, releasing data that could be leveraged against the parties involved, and liability risks.

Nevertheless, taking part in a digital thread can have significant benefits:

Lower RMA costs – through board-to-chip correlations, faster root cause analysis, running on-line RMA prevention rules, reducing “No Trouble Found” (NTF) rates and, in the worst case, implementing targeted recalls.

Improved quality and time-to-quality – by reducing the time to reach acceptable DPPM (defective parts per million) goals for new products, creating an on-line quality link between chips and boards, and using advanced failure prediction techniques such as escape prevention and outlier detection (see the sketch after this list).

More efficient test processes – via Adaptive Test, where you can use component data to test “suspect” parts more and “perfect” parts less.

Better system performance – by avoiding in-spec chips with marginal performance and pairing the right chips with the right board.
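As a concrete illustration of the outlier-detection idea mentioned above, the sketch below flags parts whose measurements pass spec limits but sit unusually far from the rest of the population. The robust z-score approach, the threshold and the data are assumptions chosen for illustration, not a description of any specific vendor’s method:

```python
import statistics
from typing import Dict, List

def flag_outliers(readings: Dict[str, float], k: float = 3.0) -> List[str]:
    """
    Flag parts whose in-spec parametric readings deviate strongly from the
    population. A robust z-score based on the median and MAD is one common
    choice; the threshold k and the single-parameter view are illustrative.
    """
    values = list(readings.values())
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values) or 1e-9
    return [
        part_id for part_id, v in readings.items()
        if abs(v - med) / (1.4826 * mad) > k
    ]

# Example: chip_004 passes spec but drifts from the population, so it would be
# tested more thoroughly (adaptive test) or held back (escape prevention).
readings = {"chip_001": 1.02, "chip_002": 0.99, "chip_003": 1.01, "chip_004": 1.35}
print(flag_outliers(readings))   # -> ['chip_004']
```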

So, can companies overcome these challenges and break down the silos across the supply chain? The answer is “Yes”: via a third-party hub. This entity sits between the parties, enabling analytics while hiding the “raw” data from the other parties in the supply chain. Only insights derived from the data are sent from the hub to the different parties. The hub ultimately provides visibility along the entire length of the supply chain, tracking issues back to their source – issues that might otherwise not be discovered until the end of the supply chain.
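The “insights only” principle can be sketched in a few lines. In this hypothetical example, the hub holds each party’s raw records internally and exposes only an aggregate board-to-chip correlation; the record shapes and the single statistic are assumptions made for illustration:

```python
import statistics
from typing import Dict

def hub_insight(chip_param: Dict[str, float],
                board_fail: Dict[str, bool],
                chip_to_board: Dict[str, str]) -> Dict[str, float]:
    """Correlate a chip test parameter with downstream board failures and
    return only the aggregate statistics, never the underlying raw data."""
    failed, passed = [], []
    for chip_id, value in chip_param.items():
        board = chip_to_board.get(chip_id)
        if board is None or board not in board_fail:
            continue
        (failed if board_fail[board] else passed).append(value)
    return {
        "mean_param_on_failed_boards": statistics.mean(failed) if failed else float("nan"),
        "mean_param_on_passing_boards": statistics.mean(passed) if passed else float("nan"),
        "n_failed": float(len(failed)),
        "n_passed": float(len(passed)),
    }
```

Each party contributes its raw data to the hub under contract; what flows back out to the other parties is only the derived result.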

The hub makes heavy use of machine learning to overcome the inherent difficulty of handling the multi-dimensional complexity and huge volumes of data that make up the digital thread. It can therefore provide suppliers and customers with insights that reduce time-to-quality and time-to-market for new products and quickly identify the root cause of complex issues. It can even enable targeted recalls of faulty systems before they fail.
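One way such a hub could apply machine learning to root-cause analysis is sketched below, assuming a simple supervised setup: train a classifier to predict field failures from upstream test parameters, then use its feature importances to point engineers toward the parameters that carry the most signal. The use of scikit-learn, the feature names and the synthetic data are all assumptions for illustration:

```python
# Rank likely root-cause parameters by how much signal they carry about failures.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
feature_names = ["vdd_min", "leakage_current", "ring_osc_freq", "solder_temp"]

# Synthetic digital-thread data: 500 units, with failures driven mostly by leakage.
X = rng.normal(size=(500, len(feature_names)))
y = (X[:, 1] + 0.2 * rng.normal(size=500) > 1.0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, importance in sorted(zip(feature_names, model.feature_importances_),
                               key=lambda t: -t[1]):
    print(f"{name:>16}: {importance:.2f}")   # leakage_current should rank first
```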

Meeting the demand and DPPM thresholds for the technologies that power high-quality electronics is essential, and it is changing what it means to be competitive. As technology advances and avoiding electronics failures and recalls becomes a primary business objective, data sharing will become common practice across the value chain. Manufacturers will need to show that they’re doing everything they can to ensure the best product possible.