Commonality Analysis (CA) refers to a set of techniques used to identify systematic causes of yield loss. These techniques typically use association rules and/or ANOVA (Analysis of Variance) to identify manufacturing variables that are “common” to bad devices. These “data mining” techniques are prone to both false positives and false negatives. Despite these weaknesses, they are often used to narrow down the possibilities, giving engineers a starting point for ad-hoc analysis and experiments to identify and correct a problem.
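As a minimal sketch of the ANOVA approach, the example below computes a one-way F statistic for per-lot yield grouped by the tool each lot ran on; a large F suggests tool-to-tool yield differences worth investigating. The tool names and yield values are hypothetical, and a production system would also compute a p-value and correct for multiple comparisons.

```python
# Sketch of equipment commonality via one-way ANOVA.
# Tool names and yield values below are invented for illustration.

def one_way_anova_f(groups):
    """Return the one-way ANOVA F statistic for {group: [values]}."""
    all_vals = [v for vals in groups.values() for v in vals]
    n = len(all_vals)
    k = len(groups)
    grand_mean = sum(all_vals) / n
    # Between-group sum of squares: how far each tool's mean yield
    # sits from the overall mean, weighted by lot count.
    ss_between = sum(len(vals) * (sum(vals) / len(vals) - grand_mean) ** 2
                     for vals in groups.values())
    # Within-group sum of squares: lot-to-lot scatter around each tool's mean.
    ss_within = sum((v - sum(vals) / len(vals)) ** 2
                    for vals in groups.values() for v in vals)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

yields_by_tool = {
    "ETCH-01": [0.92, 0.94, 0.93, 0.95],
    "ETCH-02": [0.91, 0.93, 0.92, 0.94],
    "ETCH-03": [0.81, 0.79, 0.83, 0.80],  # systematically low
}

f_stat = one_way_anova_f(yields_by_tool)
print(f"F = {f_stat:.1f}")  # a large F flags tool-to-tool differences
```

In practice the F statistic would be converted to a p-value (e.g. with `scipy.stats.f_oneway`), but a high F on its own is already a cue to inspect the low-mean tool.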
Equipment Commonality Analysis is used extensively in wafer fabs to identify equipment, or periods of time during which equipment, is systematically underperforming. For IC design firms, Commonality Analysis can be used to identify photomasks (reticles), test programs, probe cards, equipment, time periods, or wafer test and assembly steps that are systematically underperforming.
There are many data availability and cleansing issues associated with Commonality Analysis. Accurate data on products traversing the manufacturing and testing environment must be available. This data is referred to as genealogy data (how lots or batches of material are split and combined) and history data (the tool associations) for a lot, wafer, die, or device. This data is often stored in multiple systems, including Manufacturing Execution Systems (MES) and engineering test programs, and may also be manually recorded, e.g. in spreadsheets. Genealogy and history data are used to identify associations common to underperforming material.
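Once history data is assembled, finding associations common to underperforming material can be as simple as counting which (step, tool) pairs appear among bad lots but not good ones. The sketch below illustrates this with invented lot IDs, steps, and tool names; real genealogy data would first require resolving lot splits and merges down to the wafer or die level.

```python
# Hypothetical sketch: given per-lot process history and a good/bad label,
# count how often each (step, tool) pair appears among bad vs. good lots.
# Lot IDs, step names, and tool names are invented for illustration.

from collections import Counter

history = {
    "LOT-A": [("etch", "ETCH-01"), ("implant", "IMP-02")],
    "LOT-B": [("etch", "ETCH-03"), ("implant", "IMP-02")],
    "LOT-C": [("etch", "ETCH-03"), ("implant", "IMP-01")],
    "LOT-D": [("etch", "ETCH-01"), ("implant", "IMP-01")],
}
bad_lots = {"LOT-B", "LOT-C"}  # underperforming material

bad_counts = Counter(pair for lot in bad_lots for pair in history[lot])
good_counts = Counter(pair for lot in history if lot not in bad_lots
                      for pair in history[lot])

# A (step, tool) pair seen in every bad lot but no good lot is a prime suspect.
suspects = [pair for pair, n in bad_counts.items()
            if n == len(bad_lots) and good_counts[pair] == 0]
print(suspects)  # -> [('etch', 'ETCH-03')]
```

Real data rarely separates this cleanly, which is why CA tools rank associations statistically rather than requiring a perfect split; but the core operation, joining history data to a pass/fail label and counting overlaps, is the same.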
yieldWerx Enterprise provides the infrastructure and tools needed to conduct Commonality Analysis accurately and efficiently. Data stored in MES or other business systems may be loaded into the yieldWerx Enterprise database for analysis. This data, combined with test data from wafer sort, final test, and fab parametric test, enables product engineers to efficiently analyze, identify, and act on underperforming material across manufacturing and testing environments.