
Chip testing is getting harder and harder

2026-04-01


Many people outside the semiconductor industry wonder how manufacturers consistently produce transistors only tens of nanometers in size, day after day, while maintaining consistency across different process equipment, production lines, and wafer fabs. One way to achieve this is through tool-to-tool matching (TTTM).

However, as the chips produced by wafer fabs become increasingly complex, and feature sizes and process windows shrink, TTTM becomes increasingly challenging.

A wafer may undergo 600 to 800 process steps over roughly three months, so the equipment must produce consistent results, and the systems used to measure and test those results must meet extremely high standards.

"The latest technology nodes require hundreds of closely related process steps, including multiple patterning, high dielectric constant/metal gates, complex etch chemistry, selective deposition, buried power rails, and more. Throughout the manufacturing process, every tiny process defect can accumulate into a compounding effect that impacts yield," said PeiFen Teh, Director of Application Engineering at Onto Innovation. "Therefore, adopting the TTTM specification at every critical process step is crucial to ensuring process stability across the entire production line."

Shorter product lifecycles, faster yield ramps, and diversified supply chains also challenge tool matching. "Due to the fragmented supply chain and diverse product range, we need to ensure completely consistent test results, making tool matching increasingly important and challenging," said Eli Roth, Smart Manufacturing Product Manager at Teradyne. "We need greater transparency on more complex devices. Tighter tolerances and advanced packaging technologies that integrate more chips require higher device repeatability, which puts more pressure on testing to minimize sources of error. Furthermore, faster yield ramps mean less time to stabilize the new product introduction (NPI) baseline before going into production."

Tool matching

Tool matching (also known as chamber matching) ensures consistent output, for example between different automated test equipment (ATE) systems of the same model. There are various ways to achieve this, but it typically begins with standard wafers traceable to the National Institute of Standards and Technology (NIST), used to verify the accuracy of key measurements such as critical dimensions (CDs). The tools are then matched by adjusting hardware settings until the critical outputs agree. For advanced process nodes, data-driven machine learning models capture the complex nonlinear deviations between tools. The foundry then repeats these steps for the other tools in the fleet.
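As a rough illustration of the first step, the comparison of each tool against a NIST-traceable reference can be sketched as follows. The tool names, certified value, and tolerance here are invented for the example, not taken from any real fab.

```python
# Sketch: compare each tool's CD readings on a standard wafer against a
# NIST-traceable certified value, and flag tools whose mean offset
# exceeds the matching tolerance. All numbers are illustrative.
from statistics import mean

REFERENCE_CD_NM = 24.0      # certified CD of the standard wafer feature
MATCH_TOLERANCE_NM = 0.15   # maximum allowed mean offset per tool

def tool_offset(readings_nm):
    """Mean deviation of a tool's readings from the certified CD."""
    return mean(readings_nm) - REFERENCE_CD_NM

def needs_adjustment(readings_nm):
    """True if the tool's mean offset exceeds the matching tolerance."""
    return abs(tool_offset(readings_nm)) > MATCH_TOLERANCE_NM

fleet = {
    "tool_A": [24.02, 23.98, 24.05],
    "tool_B": [24.31, 24.28, 24.25],  # drifted high
}
flagged = [name for name, readings in fleet.items() if needs_adjustment(readings)]
```

In practice the adjustment loop (tweak hardware settings, remeasure, recheck) would sit around this comparison; the sketch shows only the pass/fail decision.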

Sometimes the best-performing tool serves as the reference. "Golden tools or test vehicles are widely used. We like to use a vehicle known to perform well as a reference and then statistically compare the performance of other vehicles in the fleet to that reference," Roth said. Quantifying the variability of the measurement system itself is also crucial.

Tool matching is not a "one-and-done" process. In fact, the more advanced the process, the more frequently tool matching is performed. In certain situations, tool matching is always required:

  • During tool installation/validation;

  • When introducing new products or processes;

  • After completing corrective or preventative maintenance procedures;

  • After replacing instruments or components;

  • At regular intervals, such as once a day, once per shift, or several times per shift (at advanced nodes).

Strengthen data sharing

To meet the needs of leading device manufacturers, enhanced data sharing is required. "While benchmark tool matching using manufacturer-provided data is necessary, device manufacturers are now demanding deeper matching at critical process steps to ensure consistent device performance. Achieving this level of matching requires access to fab-level device data, such as metrology and functional test results," said Melvin Lee Wei Heng, Director of Application Engineering at Onto Innovation. "Combining this device-specific information with tool-level data is critical for verifying that tools are operating within the process 'optimal point' and achieving consistent performance across the entire production line."

"We use VLSI standards traceable to NIST for step height and linewidth measurements. But beyond system calibration, we also match the optics to ensure that the illumination settings don't change as the process flow moves from one machine to another, and that the optics and system illumination conditions remain consistent," said Andrew Lopez, Application Engineer at Bruker. Using standard wafers, for example, engineers can adjust tools such as calipers or sensors to tight tolerances. "We test linearity at multiple step heights and linewidths to ensure the system is sensitive enough to detect deviations that occur during the process."

While the two are related, tool matching is not the same as tool fingerprinting, i.e., capturing a tool's "signature." Every tool in a wafer fab - scanners, etchers, cleaners, testers, optical inspection systems, and so on - has its own microscopic defects or wear, which leave marks on the parts it processes. As a result, nominally identical systems exhibit slightly different performance even when running the same process flow. By capturing and analyzing these signatures, engineers can achieve consistent performance across different tools. [Editor's note: A subsequent article will discuss process tool matching.]

Introducing machine learning models may or may not improve fingerprint recognition performance. "Traditional fingerprinting methods rely on pre-designed features, control charts, and threshold-based comparisons. These methods work well with low-dimensional and predictable variability," said Vincent Chu, Senior Consulting Manager for Advantest Test Cloud Solutions. "However, today's testers collect far more data than ever before - high-resolution parameters, waveform features, time-series measurements, and continuous telemetry data. In these high-dimensional spaces, machine learning models can capture the subtle non-linear behavior that defines the true 'features' of how tools actually operate. This allows us to represent the baseline of tester behavior more accurately and scalably, without relying entirely on predefined metrics."

In metrology, as in the testing field, precision and accuracy are important metrics. Accuracy refers to how close a measured value is to its true value. Accuracy can be improved by comparing the measured value to measurements of known standards, such as standard wafers with multiple features, but this is difficult to achieve.

"Of course, we want to ensure that every measurement result is labeled 'accurate,' but that's nearly impossible. We usually only pursue precision, and as we accumulate experience and consistently achieve this goal, we eventually get a good yield," said Chris Mack, co-founder and CTO of Fractilia. "So we call it 'accurate,' but it's not a truly 'accurate' value in the sense of conforming to NIST standard measurement results. Nevertheless, precision remains the most important characteristic of our measurement tools."

Precision is determined by measuring the same feature multiple times and recording the variability around the central value.
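The distinction between the two metrics can be shown in a short sketch; the certified value and readings below are made up for illustration.

```python
# Sketch: accuracy vs. precision for repeated measurements of one feature.
# Accuracy = closeness of the mean to the true (certified) value;
# precision = spread of repeated readings around their own center.
from statistics import mean, stdev

CERTIFIED_VALUE = 50.0  # known standard, e.g. a step height in nm
readings = [50.8, 50.9, 50.7, 50.8, 50.9]

accuracy_bias = mean(readings) - CERTIFIED_VALUE  # systematic offset
precision = stdev(readings)                       # repeatability
```

Here the tool is quite precise (readings cluster tightly) but not very accurate (the whole cluster sits well above the certified value) - exactly the situation Mack describes, where precision is achievable but NIST-grade accuracy is not.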

How tool matching works

Matching metrics depend on the tool. For example, in acoustic microscopy imaging, metrics include image intensity, signal amplitude, depth response, and defect detectability. "We use Long-Term Stability Monitoring (LTSM), or global tool matching. It uses known or reference samples and software algorithms to compensate for any system-to-system differences by normalizing the acoustic image response, ensuring consistent test results across different tools and locations," says Bryan Schackmuth, Acoustic Microscopy Imaging (AMI) Product Line Manager at Nordson. "LTSM allows for image normalization, resulting in matched images without manual operator adjustments. This global matching procedure is typically run each time the operating frequency is changed (e.g., when changing a transducer) or before the start of each shift or day."
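A minimal sketch of this kind of image normalization, assuming a simple multiplicative gain model (the response values are invented; a real LTSM procedure is more involved):

```python
# Sketch: normalize a tool's acoustic image intensity to a reference
# response measured on a known sample, so matched tools produce
# comparable images without manual operator adjustment.
def normalization_gain(reference_response, measured_response):
    """Per-tool gain so the known sample reads the same on every tool."""
    return reference_response / measured_response

def normalize_image(pixels, gain):
    """Apply the per-tool gain to every pixel intensity."""
    return [p * gain for p in pixels]

REFERENCE_RESPONSE = 100.0  # known sample's intensity on the reference tool
tool_response = 92.0        # same sample measured on this tool
gain = normalization_gain(REFERENCE_RESPONSE, tool_response)
image = normalize_image([92.0, 46.0, 138.0], gain)
```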

The correlation between metrology results and electrical test results is growing stronger. "Tool matching is typically based on a series of steps, and the specific approach may vary slightly from fab to fab or original equipment manufacturer (OEM)," says Joe Fillion, Director of Product Management at Onto Innovation. "The first step is a fingerprint or configuration comparison. The tools need to be as compatible as possible in software and hardware - same software version, lens, aperture, light source, MFC, and so on. Once compatibility reaches a reasonable level, the tools typically run standard automated test or calibration procedures to ensure consistent performance across tools. If the results are consistent and meet the expected specifications, a standard verification run measures actual performance on the wafer. These results include a target value as well as upper and lower limits to ensure operation within acceptable ranges."
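The verification run at the end of this sequence amounts to a limit check around a target; a minimal sketch, with illustrative target and limits:

```python
# Sketch: verification run - wafer measurements must fall between the
# lower and upper limits around the target. Numbers are illustrative.
TARGET = 12.0
LOWER, UPPER = 11.7, 12.3

def within_spec(measurements):
    """True if every measurement lies inside the control limits."""
    return all(LOWER <= m <= UPPER for m in measurements)

run = [11.9, 12.1, 12.05, 11.85]
passed = within_spec(run)
```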

Ms. Teh from Onto provided a detailed step-by-step guide for tool matching. She stated, "We first ensure the performance of each tool component is consistent, and then monitor the matching performance of the entire tool group." She also mentioned:

  • Component-level calibration: monitor system health-check parameters and recalibrate when specifications are exceeded;

  • System-level calibration: check the spectral response of a group of tools measured on a standard wafer;

  • Spectral calibration: used to improve fleet matching levels; and

  • Parameter result monitoring: uses a standard wafer (measuring critical dimensions, thickness, or material constants); recalibration can be performed to optimize the tool matching level for each parameter.

Regarding test equipment, engineers need to closely monitor component drift. "Thermal sensors drift over time," says Roth of Teradyne. "There's a time skew. We typically control drift through periodic calibration and reference checks, and we continuously compare the equipment against a reference scanner to understand the range of deviation and when periodic calibration is needed. SPC monitoring and big-data monitoring are other approaches. Like a hammer always looking for a nail, we may need to run periodic calculations."
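One common SPC technique for catching the slow drift Roth describes is an exponentially weighted moving average (EWMA) chart, which accumulates small sustained shifts that individual readings would hide. A minimal sketch, with an invented smoothing constant and control limit:

```python
# Sketch: EWMA drift monitoring. The smoothed statistic z tracks slow
# sensor drift; an alarm fires when z leaves the control band around
# the target. Smoothing constant and limit are illustrative.
def ewma_drift_alarm(readings, target, lam=0.2, limit=0.5):
    """Index of the first reading at which the EWMA leaves +/- limit
    of the target, or None if it never does."""
    z = target
    for i, x in enumerate(readings):
        z = lam * x + (1 - lam) * z
        if abs(z - target) > limit:
            return i
    return None

# Slow upward drift: each reading creeps 0.15 above the previous one.
readings = [25.0 + 0.15 * k for k in range(20)]
alarm_at = ewma_drift_alarm(readings, target=25.0)
```

A small `lam` makes the chart sensitive to sustained drift while smoothing out single-reading noise; static thresholds on raw readings would trigger much later here.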

Depending on the tool configuration, tool-level calibration is sometimes built-in. "Our test equipment is based on high-precision resistors, so it uses a self-verifying approach to ensure the correctness of every measurement. This is how we verify that each tool is calibrated and that measurements are consistent across different test equipment," says Jesse Ko, COO of Modus Test.

Electrical testing and metrology often complement each other. "Foundries have introduced in-line electrical/functional testing to ensure that equipment performance is at a level that won't affect the device," says Heng of Onto. "At certain critical process steps, cross-sectional analysis is performed to ensure that the resulting profile meets the device specifications of the sensitive layer - requirements that traditional metrology methods cannot meet."

Intel's "Copy Exactly" Strategy

Another approach is to start with the results rather than the tools and work backward. Intel's "Copy Exactly" strategy did just that, replicating everything in the fab - equipment, methods, and processes - yet the results still differed. Ultimately, the company traced the differences to environmental factors such as humidity. Calibrating the equipment may be only the first step in a complex investigation.

"It's the same model, the same calibration," said Jon Holt, Global Fab Application Solutions Manager at PDF Solutions. "You need to ensure the accuracy of your measurements, because that's another potential source of discrepancies between two sites. Measuring at the same location or using the same tools is one approach. But then you need to consider environmental factors more comprehensively and holistically. Are your cooling water, gas supply, and gas distribution settings the same? All of this information is crucial. The real challenge ultimately lies in functionality. Is the component functioning as expected? Does the device have the expected current, breakdown voltage, gain, or speed? I can't simply plug in a fault detection and classification (FDC) tool and match all the sensor outputs to make the chambers matched. I wish it were that simple."

As the industry moves towards fully automated operations, tool matching is likely to become more tightly intertwined with the production process. "It will likely shift from periodic calibration-driven activities to continuous, data-driven monitoring systems," Roth said. "We won't need to flip through reference cards repeatedly anymore; instead, we'll have continuous automated monitoring with tags and alarms - a more advanced version of what we're currently doing."

It's worth noting that not long ago, it wasn't possible to match individual CD-SEMs in the field. "We didn't initially plan to release a product to improve tool matching," said Mack of Fractilia. "But we found that by measuring the errors in CD-SEMs and removing them from the metrology results, we could obtain more accurate measurements, and this strategy naturally improved tool matching."

The next step in the development of CD-SEMs is controlling stochastics. "As tolerances across all CDs continue to shrink, tool matching between CD-SEMs becomes extremely difficult. Furthermore, tool matching is needed for stochastic effects such as linewidth roughness, line edge roughness, and CD uniformity. This is something we've never done before, so we can say we're feeling our way forward," Mack said.

As signal-to-noise ratios become increasingly difficult to maintain, metrology is turning to machine learning. "As feature sizes shrink, it becomes increasingly difficult to match what needs to be measured," explained Teh of Onto. "We expect the spectral sensitivity of sub-nanometer parameters to approach the noise floor of metrology tools. Some very small parameters will be masked by more sensitive parameters. In this case, machine learning models can be used to amplify key signals. Besides enhancing the signal, machine learning models also play a crucial role in managing tool fingerprints. They can effectively record and identify changes made to a tool and correlate those changes with tool performance (hardware, software, and wafer results), providing a deeper understanding of causality. Once a certain level of trust is established, the next step is to automate decision-making."

"Machine learning evolves tool fingerprinting from manually defined statistics into learnable behavioral representations, which is particularly useful for advanced test systems handling the massive amounts of data generated in high-volume fleet operations," said Chu of Advantest. "Machine learning can also enhance anomaly detection, which is critical in production testing. By learning the normal behavior patterns of specific test cells, models can identify early deviations caused by calibration offsets, component aging, environmental changes, or loadboard effects - typically earlier and more reliably than static thresholds. In multi-tool fleets, machine learning can highlight cross-tester differences that may affect binning or correlations. Nevertheless, machine learning complements, rather than replaces, traditional statistical methods."
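A heavily simplified stand-in for this kind of baseline learning, using only a per-parameter mean and standard deviation (a production system would use far richer models and features, but the flow - fit on known-good runs, then flag deviations - is the same):

```python
# Sketch: learn a test cell's "normal" baseline from known-good runs,
# then flag values that deviate beyond a z-score threshold. All data
# and the threshold are illustrative.
from statistics import mean, stdev

def fit_baseline(history):
    """Learn mean/std for one parameter from known-good runs."""
    return mean(history), stdev(history)

def is_anomalous(value, baseline, z_threshold=3.0):
    """True if the value deviates more than z_threshold sigmas."""
    mu, sigma = baseline
    return abs(value - mu) > z_threshold * sigma

good_runs = [1.00, 1.02, 0.98, 1.01, 0.99, 1.00, 1.02, 0.98]
baseline = fit_baseline(good_runs)
drifted = is_anomalous(1.30, baseline)  # e.g., a calibration offset
```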

In conclusion

Tool matching is not a new process for wafer fabs and test facilities, but its difficulty increases significantly with shrinking device sizes, increasing device complexity, shorter process windows, and stricter tolerance requirements. At the 2nm node, metrology systems are nearing their performance limits, making any improvement in signal-to-noise ratio invaluable.

For example, when engineers measure 3nm features, they require overlay accuracy less than 0.3nm. To achieve this, engineers now need to model the stochastic effects of line edge roughness, linewidth roughness, and CD uniformity to match different CD-SEM tools.

Engineers typically begin by comparing tool features to match tools at the component level. From component to system to parameter calibration, the matching process becomes more sophisticated and automated with the help of machine learning. To achieve finer tool matching in metrology, engineers need access to the wafer fab's electrical test data. This ensures the matching results the wafer fab truly wants - high-yield, high-performance chips every time. Tool matching plays a crucial role in ensuring high yields, especially at state-of-the-art device nodes.

Source: Compiled from semiengineering

 

 
