In today's "Age of Smart," many mission-critical applications call for zero DPPM (defective parts per million) in semiconductors and electronic systems.
In industries such as automotive, medical, aerospace, and more, where lives are at stake, defective parts are not an option.
The quality imperative
However, with the ever-growing complexity of technologies and supply chains, achieving zero DPPM seems more than ever to be an ambitious, unattainable goal. While improving test quality can decrease faults to a point, this alone will not move the needle.
Namely, we are still left with the need to:
- Maximize product fault detection;
- Accelerate process issue detection;
- And even predict and prevent faults based on previous failures.
Yet current testing approaches and infrastructures, regrettably, stand in the way.
For zero DPPM, we would need a whole new kind of quality and reliability testing infrastructure: one that leverages data to deliver real-time, actionable insights to stakeholders across the supply chain, enabling quick and effective remediation and prevention.
The 3 obstacles
Today’s infrastructures cannot deliver these capabilities due to three main obstacles:
- Data is contained in silos
- There is no data sharing
- There is no one unified analytics infrastructure
Let’s take a closer look at each of these:
Data silos
Product- and process-related test data is contained today in siloed, disparate systems across the organization, making effective aggregation, cross-referencing, and analysis impossible. The data is contained not only in different systems across the organization, but across the entire supply chain itself, in many different organizations.
Moreover, this data is varied and complex, including parametric test data, chip and board genealogy, repair/rework, MES process data, and information about RMAs. Effective quality and reliability testing would require taking all of this data, from every possible data source, which – to date – has been impossible.
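To make the aggregation problem concrete, here is a minimal sketch of what pulling these disparate records into one per-unit view might look like. The data-source names, field names, and serial-number key are all illustrative assumptions, not a real schema.

```python
# Hypothetical sketch: merging siloed test records into one view per unit,
# keyed on a shared serial number. All names and fields are illustrative.

def merge_by_serial(*sources):
    """Combine records from disparate systems into one dict per unit."""
    units = {}
    for source_name, records in sources:
        for rec in records:
            unit = units.setdefault(rec["serial"], {"serial": rec["serial"]})
            # Prefix each field with its source so fields never collide.
            for key, value in rec.items():
                if key != "serial":
                    unit[f"{source_name}.{key}"] = value
    return units

# Illustrative records from three siloed systems.
parametric = [{"serial": "A123", "vdd_ma": 41.7}]
mes = [{"serial": "A123", "oven": "R2", "reflow_c": 245}]
rma = [{"serial": "A123", "status": "field_return"}]

view = merge_by_serial(("parametric", parametric), ("mes", mes), ("rma", rma))
# view["A123"] now holds parametric, process, and RMA data side by side.
```

In practice each source would arrive in a different format and volume, which is exactly why this merge has been so hard to do across organizational boundaries.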
No data sharing
Since the multiple stakeholders across the supply chain do not share their data, it is not possible to cross-reference results to detect the subtler failures that can result from cross-process events. A fault or defect does not necessarily arise from one discrete component or stand-alone process. Quite often, the root cause remains hidden because it is the result of a combination of factors, components, or processes that cross the manufacturing life cycle and supply chain.
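The cross-process point can be illustrated with a toy example: a failure that only shows up when two conditions from different supply-chain stages combine, so neither party's data alone reveals it. The factor names (fab lot, solder paste) are hypothetical.

```python
# Hypothetical sketch: a defect appears only when a particular fab lot
# meets a particular assembly material. Each stage's data alone looks fine;
# only the cross-referenced combination exposes the pattern.
from collections import Counter

def failure_rate_by_combo(units):
    """Compute failure rate per (fab_lot, solder_paste) combination."""
    totals, fails = Counter(), Counter()
    for u in units:
        combo = (u["fab_lot"], u["solder_paste"])
        totals[combo] += 1
        if u["failed"]:
            fails[combo] += 1
    return {c: fails[c] / totals[c] for c in totals}

units = [
    {"fab_lot": "L1", "solder_paste": "P1", "failed": False},
    {"fab_lot": "L1", "solder_paste": "P2", "failed": True},
    {"fab_lot": "L1", "solder_paste": "P2", "failed": True},
    {"fab_lot": "L2", "solder_paste": "P2", "failed": False},
]

rates = failure_rate_by_combo(units)
# Only the (L1, P2) combination fails; each factor alone appears benign.
```

The foundry sees lot L1 passing with paste P1, and the assembler sees paste P2 working on lot L2, so without shared data both parties conclude their own process is healthy.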
No one unified analytics infrastructure
Even if data sharing were in place, along with the ability to pool all these volumes of data from within organizations and from all the stakeholders, there would still be a need for one cohesive and coherent analytics mechanism to extract valuable, timely, and actionable insights.
Moreover, the conclusions drawn from the analytics would need to be delivered only to those stakeholders who own the process around the issue at hand. This way, only they need to get involved, which streamlines and accelerates corrective action.
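The owner-only delivery described above can be sketched as a simple routing step. The stakeholder names and process steps here are hypothetical placeholders, not part of any real QPaaS interface.

```python
# Hypothetical sketch: deliver each finding only to the stakeholder who
# owns the affected process step. All names are illustrative.

OWNERS = {
    "wafer_sort": "foundry",
    "smt_assembly": "ems_provider",
    "final_test": "oem",
}

def route_findings(findings):
    """Group findings into per-stakeholder inboxes by process ownership."""
    inbox = {}
    for finding in findings:
        owner = OWNERS[finding["process_step"]]
        inbox.setdefault(owner, []).append(finding["issue"])
    return inbox

inbox = route_findings([
    {"process_step": "smt_assembly", "issue": "reflow profile drift"},
    {"process_step": "final_test", "issue": "marginal vdd readings"},
])
# Each stakeholder receives only the issues in their own process;
# stakeholders with no findings receive nothing at all.
```

The same ownership map is what keeps one party's data and findings invisible to the others, which is the IP-protection property discussed below.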
Clearly, zero DPPM for the semiconductor and electronic systems supply chain requires a whole new quality paradigm.
The new quality paradigm
While all this may seem to be insurmountable, I am very happy to report that today there is a new paradigm that takes us closer to zero DPPM. Namely, when leveraging Quality Protection as a Service (“QPaaS”), all stakeholders across the supply chain benefit.
So, what is QPaaS? In effect, it's a cloud-based data hub, managed by a trusted third party, that collects data from semiconductor original component manufacturers as well as from electronics OEMs.
The hub then crunches all the data, applying advanced analytics and machine learning, and delivers findings directly only to the relevant stakeholder.
This hub breaks down the data silos, enabling stakeholders to share data, and performs the analytics to achieve zero DPPM.
The natural question on the reader's mind might now be: "Data sharing? How will competing entities ever agree to share their data? Doesn't this amount to exposing IP?" Well, actually, no. Let me explain. The QPaaS data hub ensures IP protection, since the aggregated data is never exposed to any of the other parties.
Rather, only those who need to resolve a detected issue will receive and have access to that piece of information. Moreover, the hub directly handles all activities relating to security and traceability.
Indeed, this new paradigm brings great promise for achieving the previously unimaginable: zero DPPM for semiconductor and electronic systems manufacturing.
To make the impossible possible, and to learn more about how you too can achieve zero DPPM, I invite you to reach out to me via LinkedIn and start the conversation.