Is it enough to define throughput as the number of units tested in an allotted time? When units labeled “good” are later flagged as “questionable”, leading to the inevitable retest or RMA, the utilization of equipment and data clearly hasn’t been maximized.

How efficient is the team overseeing operations if they are constantly sidetracked by the need to tweak test programs or identify test equipment issues? How confident are CapEx decisions when utilization measurements aren’t properly calculated? 

At Optimal+, we prefer to look at it all through the prism of Productivity – the benchmark for both the machines and the people charged with maximizing their efficiency.

Manufacturing intelligence to increase Productivity:

Bottom line impact

Optimal+ IIoT solutions deliver up to a 20% improvement in operational efficiency and productivity across both semiconductor and electronics manufacturing operations, the result of our fully automated, intelligent test management solutions.

Empowering your team

With Optimal+ driving IIoT connectivity, product engineers, test managers and other operations team members become more efficient in tracking down manufacturing issues that impact yield and quality.


Combining key metrics

Our IIoT product analytics solutions help you achieve greater equipment utilization, raise throughput with higher levels of efficiency and make better use of your test fleet and global operations, all while improving your testing processes and products.

“With Optimal+ we were able to collect and analyze data concerning equipment utilization and implement corrective actions immediately.”


Dr. Albert Wu

VP of Operations,
Marvell

Examples

Preventing Test Time Degradation
Finding the Issue
A high-level, portal-generated report shows that a certain product has higher test times than expected. The engineer uses detailed touchdown data in the portal to analyze the problem and find the root cause.

Performing the Analysis
Using the portal to view test time over the duration of the entire lot, the engineer immediately discovers a steady increase in touchdown time.

Preventing Future Recurrences
A rule is created using the rules interface to generate an alert when similar conditions surface again.

Touchdown data chart in OT-Portal showing the lengthening test time

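To make this kind of check concrete, here is a minimal Python sketch, not the Optimal+ rules interface, that fits a linear trend to per-touchdown test times and flags a lot whose test time keeps climbing. The data layout, slope threshold and function names are illustrative assumptions.

```python
# Illustrative sketch only: detect a steady upward trend in touchdown test time
# within one lot. Field names and thresholds are hypothetical, not the Optimal+ API.
from statistics import mean

def test_time_trend(touchdown_times):
    """Least-squares slope of test time, in seconds per touchdown."""
    n = len(touchdown_times)
    xs = list(range(n))
    x_bar, y_bar = mean(xs), mean(touchdown_times)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, touchdown_times))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den if den else 0.0

def flag_degradation(touchdown_times, max_slope=0.05):
    """Alert when test time grows faster than max_slope seconds per touchdown."""
    return test_time_trend(touchdown_times) > max_slope

# Example: test time creeping up by ~0.1 s per touchdown trips the alert.
lot = [12.0 + 0.1 * i for i in range(50)]
print(flag_degradation(lot))  # True
```

A trend test of this sort is the kind of condition a monitoring rule can evaluate lot by lot, so a slow drift becomes an immediate alert rather than a post-mortem finding.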

Reducing Wasteful Retests
Finding the Issue
The Optimal+ solution’s portal shows that, while the online retest rate for a product is high, the yield reclaimed by retesting the dice is very small. This presents an opportunity to increase throughput, or to reduce retest costs, by eliminating wasteful retests (i.e., retests of dice with a low probability of becoming “good”).

Performing the Analysis
The engineer uses the wafer map viewing capabilities of the solution’s portal to display a stacked composite wafer map containing the final results after all retests. Small yellow triangles highlight the dice that were retested. As the map makes clear, very few of the dice were reclaimed: 98% of the retests did not result in any bin recovery or bin switching. The retest policy is changed to retest only those bins with significant recovery rates.

Preventing Future Recurrences
Retest recovery rate rules are defined in the rules interface and are triggered whenever low retest recovery rates are detected, enabling the user to fix the issue quickly and increase throughput.

Stacked composite wafer map in OT-Portal highlighting retested dice (yellow triangles)

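To show the arithmetic behind the retest policy change, here is a minimal Python sketch, not the Optimal+ API, that computes per-bin retest recovery rates and keeps only the bins worth retesting. The bin numbers, the 5% cutoff and the data format are illustrative assumptions.

```python
# Illustrative sketch only: per-bin retest recovery rates from (first_bin, final_bin)
# pairs. Bin numbers and the 5% cutoff are hypothetical, not an Optimal+ policy.
from collections import defaultdict

def recovery_by_bin(retests, good_bins=frozenset({1})):
    """retests: iterable of (first_fail_bin, final_bin) pairs, one per retested die."""
    attempts = defaultdict(int)
    recovered = defaultdict(int)
    for first_bin, final_bin in retests:
        attempts[first_bin] += 1
        if final_bin in good_bins:
            recovered[first_bin] += 1
    return {b: recovered[b] / attempts[b] for b in attempts}

def bins_worth_retesting(retests, min_recovery=0.05):
    """Keep only the fail bins whose retests recover at least min_recovery of the dice."""
    return {b for b, rate in recovery_by_bin(retests).items() if rate >= min_recovery}

# Example: bin 7 recovers 1 of 2 retested dice, bin 9 recovers 0 of 3,
# so the retest policy keeps bin 7 and drops bin 9.
history = [(7, 1), (7, 7), (9, 9), (9, 9), (9, 9)]
print(bins_worth_retesting(history))  # {7}
```

In the case above, retesting only the bins that actually recover dice is what turns the 98% of wasted retests back into usable tester time.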

Comparing Test Time across a Fleet of Testers
Finding the Issue
A cross-entity rule triggers an alert when significant performance differences are detected between testers. Cross-entity rules compare different entities (such as testers, probe cards and load boards) and highlight equipment with significantly poor performance.

Performing the Analysis
In this instance, the rule monitors the average good-bin test time and flags slower testers that result in low throughput. The specific testers are checked and found to have incorrect settings that negatively impact their performance.

Preventing Future Recurrences
A rule is created within the solution to generate an alert when similar conditions resurface.

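The cross-entity comparison can be sketched in a few lines of Python; this is illustrative only, not the Optimal+ rule engine. The tester names, the 20% margin and the fleet-median baseline are assumptions made for the example.

```python
# Illustrative sketch only: flag testers whose average good-bin test time is well
# above the fleet median. Tester IDs and the 20% margin are hypothetical.
from statistics import mean, median

def slow_testers(times_by_tester, margin=0.20):
    """times_by_tester: {tester_id: [good-bin test times, in seconds]}.
    Returns testers whose average is more than `margin` above the fleet median."""
    averages = {t: mean(v) for t, v in times_by_tester.items() if v}
    fleet_median = median(averages.values())
    return {t: avg for t, avg in averages.items()
            if avg > fleet_median * (1 + margin)}

# Example: T03 runs roughly 25% slower than its peers and is flagged for review.
fleet = {
    "T01": [10.1, 10.3, 9.9],
    "T02": [10.0, 10.2, 10.4],
    "T03": [12.6, 12.8, 12.7],
}
print(slow_testers(fleet))  # flags T03 (average of about 12.7 s)
```

Comparing each tester against the fleet median rather than an absolute limit keeps the check meaningful as products and test programs change.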