Much Smarter Manufacturing

HOREXS is a well-known IC substrate PCB manufacturer in China. Most of its PCBs are used for IC packaging, testing, and assembly.

Smart manufacturing is undergoing some fundamental changes as more sensors are integrated across fabs to generate more usable data, and as AI/ML systems are deployed to sift through that data and identify patterns and anomalies more quickly.

The concept of smart manufacturing — also referred to as Industrie 4.0 in Europe, for the fourth industrial revolution — emerged from the World Economic Forum meeting in 2016 as a high-level goal for improving manufacturing efficiency through the use of new technology and better communication. Since then it has reached well beyond that initial concept, incorporating a growing portion of the semiconductor supply chain and adding more consistency and insights into manufacturing.

The semiconductor industry has been pushing in this direction even before there was a label. Over the past few years, equipment makers have been adding more sensors into equipment and various manufacturing processes. Chipmakers also have begun adding feedback loops from sensors in the field back to the manufacturing process to be able to pinpoint problems earlier in the manufacturing flow.

In the past, this largely was confined to manufacturing, and to a lesser extent the EDA tools that enabled design for manufacturing. But those changes now are starting to filter out into the rest of the supply chain for several reasons:

If data can be structured and integrated earlier in the design-through-manufacturing flow, it can be used to drive quality improvements across the supply chain. This is essential for safety-critical applications such as automotive, robotics, and aerospace, but it also helps reduce field failures and recalls, which are costly in terms of dollars and brand reputation.

More sensors are being added into every aspect of manufacturing, and AI/machine learning systems are being developed to sift through that data and identify problems or patterns earlier in the manufacturing cycle (a simple sketch of that kind of screening appears after this list). That can help pinpoint problem areas at a granular level, theoretically all the way down to parts per quadrillion.

The chip industry is disaggregating as more chipmakers opt for packaged solutions rather than putting everything on a single SoC, and as more end markets demand customized solutions. That requires more integration of data from around the globe to ensure quality.
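
As a rough illustration of the kind of data sifting described above, the following sketch runs a generic unsupervised outlier detector over synthetic per-run tool telemetry. The sensor names, values, and the choice of scikit-learn's IsolationForest are assumptions made for this example; it is not a description of any particular fab's system.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)

# Synthetic per-run tool telemetry: chamber pressure, RF power, He leak rate.
# Column meanings and values are invented for illustration.
normal_runs = rng.normal(loc=[120.0, 500.0, 0.5], scale=[2.0, 5.0, 0.05], size=(2000, 3))
drifted_runs = rng.normal(loc=[126.0, 470.0, 0.9], scale=[2.0, 5.0, 0.05], size=(20, 3))
runs = np.vstack([normal_runs, drifted_runs])

# Unsupervised screening: flag runs that sit far from the bulk of the data
# so an engineer can review them earlier in the manufacturing cycle.
model = IsolationForest(contamination=0.01, random_state=0).fit(runs)
labels = model.predict(runs)            # +1 = inlier, -1 = flagged for review
flagged = np.where(labels == -1)[0]
print(f"flagged {len(flagged)} of {len(runs)} runs, e.g. indices {flagged[-5:]}")
```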

“The major change we’re seeing is the development of more integrated systems,” said Antoine Dupret, research director and fellow at Leti. “It’s not just about the ICs themselves. You have to look at the substrate and the SoC or SiP (system-in-package) design. And then, when you put those on an interposer, you have physical issues and process issues. Within a package, you need to make all the chips work together, so you have thermal and electrical coupling issues. And you need to make sure the interposer and the package were designed properly, and whether the communication between chips is correct.”


In a multi-chip implementation, this requires a deep understanding of the characteristics of each chip, so everyone involved needs to understand how it will be used.

“This isn’t as simple as putting together LEGOs,” said Dupret. “We are working on a project for the EU to define an integrated architecture (FET) where you can use different design tools on different continents, but still have a common definition of processing and communication. That way you will be able to put all of this together without redesigning everything.”

DARPA has proposed a similar approach with its CHIPS program in the United States, using a chiplet model for reducing cost, complexity, and the time it takes to develop multi-chip solutions. But the challenge in all of these approaches is making sure everything works as expected with third-party IP, and that requires consistency in characterization, an understanding of how chiplets or other IP will be used and what it will be next to, and how it will be packaged. In addition, all of this needs to be developed securely and with the same metrics for reliability.

“Today, a lot of fabs are horizontally and vertically integrated,” said Sujeet Chand, senior vice president and CTO of Rockwell Automation, in a presentation at SEMI’s Global Smart Manufacturing Conference. “There is information sharing across the fab, and there is vertical integration from OT into the IoT environment. The next evolution of connected enterprise requires you to take the fab that is vertically and horizontally integrated, and connect it with your suppliers, with the tool builders, the supply chain, and other third parties who can add value to what goes on inside the fab. So essentially we’re talking about opening up the data channels from inside a fab to reach outside in order to bring more value to the fab production. We’re able to do this today with standards and technology, but we were not able to do it in the past.”

The common threads here are ubiquitous, standardized connectivity and end-to-end digitization, which can be simulated both within a company and across the supply chain.

“Simulation has been around for a very long time,” said Chand. “What’s changed is that today we can take physics-based simulation, which in the past would run on supercomputers, and bring that physics-based simulation into a real-time environment using reduced-order models and more computing power that’s available in the OT (operational technology) environment. So we can create digital twins with sophisticated simulation, and we can mix together digital threads to provide value using the connectivity that we get in manufacturing.”
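
To make the reduced-order idea concrete, here is a minimal sketch of one common technique, proper orthogonal decomposition (POD): full-order simulation snapshots are compressed into a small basis whose coefficients are cheap enough to evaluate in real time. The snapshot data below is synthetic placeholder data and the approach is generic; it is not Rockwell's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder "full-order" data: snapshots of a 1-D field built from a few
# smooth modes plus noise, standing in for states of a physics simulation.
n_dof, n_snapshots, r = 2000, 200, 5
x = np.linspace(0.0, 1.0, n_dof)
modes = np.stack([np.sin((k + 1) * np.pi * x) for k in range(5)])      # (5, n_dof)
weights = rng.standard_normal((n_snapshots, 5))
snapshots = (weights @ modes).T + 0.01 * rng.standard_normal((n_dof, n_snapshots))

# Proper orthogonal decomposition: the leading left-singular vectors form a
# low-dimensional basis that captures most of the snapshot variance.
U, S, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :r]                                   # n_dof x r reduced basis

# Project a full-order state into the reduced space and reconstruct it;
# evolving the r coefficients is cheap enough for real-time use.
full_state = snapshots[:, 0]
reduced_coords = basis.T @ full_state
approximation = basis @ reduced_coords
error = np.linalg.norm(full_state - approximation) / np.linalg.norm(full_state)
print(f"relative reconstruction error with {r} modes: {error:.4f}")
```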

Some of this is already in use by chipmakers. Coventor, a Lam Research Company, has been developing virtual wafers to allow engineering teams to test equipment settings with more variation than they would be able to do in a physical production environment.

“Statistical analysis can provide greater confidence in the choice of process settings,” said David Fried, vice president of computational products at Lam. “Defects and random variations can be modeled in a virtual fab in a way that’s not possible in a real fab, letting developers test the sensitivity of the device structures to the unpredictable aspects of processing.”
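
The statistical idea can be illustrated with a toy Monte Carlo run: vary a couple of hypothetical process parameters, derive a simple device metric, and check which source of variation dominates the simulated yield loss. The parameter names, distributions, and failure threshold below are invented for illustration and are not Coventor or Lam settings.

```python
import numpy as np

rng = np.random.default_rng(42)
n_wafers = 10_000

# Hypothetical process parameters with nominal values and random variation
# (names, units, and distributions are illustrative only).
etch_depth = rng.normal(loc=60.0, scale=1.5, size=n_wafers)   # nm
line_width = rng.normal(loc=20.0, scale=1.0, size=n_wafers)   # nm

# Toy "device metric": the structure is assumed to fail if its aspect ratio
# drops below an arbitrary threshold.
aspect_ratio = etch_depth / line_width
passes = aspect_ratio >= 2.8

print(f"simulated yield: {passes.mean():.1%}")

# Crude sensitivity check: correlate each varied input with pass/fail to see
# which source of variation dominates the simulated yield loss.
for name, values in [("etch_depth", etch_depth), ("line_width", line_width)]:
    corr = np.corrcoef(values, passes.astype(float))[0, 1]
    print(f"{name:>10s} correlation with pass/fail: {corr:+.2f}")
```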

What works, what doesn’t

For companies that have successfully implemented this scheme, the ROI has been significant. The problem is this is a mammoth undertaking, and it requires a clear vision and implementation blueprint from a high level down to the individual processes within a company.

“Everybody recognizes smart manufacturing is important to competitiveness,” said Benjamin Dollar, principal in the Global Supply Chain practice at Deloitte Consulting, noting that in a survey of companies, 83% said it will transform the way products are made in five years. “We’ve got a lot of technology in place today, but we’re not necessarily seeing the gains we’d like to see. There is a question of value, and that is overwhelmingly what we hear from our clients. Companies are struggling with capturing value at the same pace as technology progression.”

He said the main roadblocks for successful implementations are understanding the overall solution and getting employees to buy into the scheme. “You need to start with a strategic vision,” said Dollar. “That’s great, but it has to be implemented locally — at plants, at facilities, at factory cells.”

Schneider Electric started down this path several years ago, according to Stephane Piat, the company’s senior vice president of global supply chain and strategy. It focused on six digital accelerators — source, make, deliver, plan, care, innovate — and five supply chain models — collaborative, lean, agile, project-driven, and fully flexible.

“The technology is not free,” Piat said. “We make sure that any time we implement the technology in any of our sites we get ROI.”

Since 2017, it has ramped its smart manufacturing blueprint to include more than 100 smart factories and distribution centers, and has extended that out to include suppliers and customers. Piat said typical payback is 1.5 to 2 years, and that Schneider Electric has seen 30% energy savings, 30% CapEx reduction, 28% OpEx reduction, and 22% increase in equipment availability. It also has reduced quality issues by 15% and seen a 20% reduction in its CO2 footprint.

Not all results are so positive. Nevertheless, the capability to achieve those results is growing as the amount of processing power increases, and as more data is made available.

“With cloud computing, you can make this data available across the enterprise for collaboration, for greater visibility, for more sophisticated simulation,” said Chand. “And you can weave together digital threads, whether it’s on the supply side or on the design side. It brings a whole level of new capability to fabs. As we build these connected enterprises, no one company can do it alone. We need to leverage an ecosystem, and this will bring a step change as to how manufacturing brings value in the future.”

Traceability and communication issues

A key part in improving reliability involves traceability and better communication, and this can get very complicated.

“You start out with a wafer into a wafer fab, and then you build layers on top of that and put microchips on that,” said Michel Janus, a manufacturing digitalization researcher at Bosch. “While they’re on the wafer, to identify which chip we’re talking about, in which database or process step, it simply is sufficient to know which wafer you’re talking about and the coordinates of the chip. But as soon as the wafer gets [diced] and shipped off to assembly, you have to keep the knowledge about where on the wafer each individual chip was, and which chips get paired together into one package. Then, when they go from assembly to the back end, where they are being final quality tested and tested for functionality, and then shipped off to the customer — which could be an electrical control unit — we want to ensure we can identify a single chip. When a customer comes to us with a problem, we know back to the original wafer which part of the wafer we are talking about.”

From there, the chipmaker can drill down into whether it was a process problem, determine what caused it, and use that data to improve yield in the future.
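
A minimal sketch of what such a genealogy record might look like appears below. The field names and the in-memory lookup are illustrative assumptions; real traceability schemas and databases differ from fab to fab.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DieRecord:
    """Genealogy for one die, kept from wafer test through final assembly.

    Field names are illustrative; real traceability schemas vary by fab.
    """
    lot_id: str        # wafer lot in the front-end fab
    wafer_id: str      # wafer within the lot
    die_x: int         # die coordinates on the wafer before dicing
    die_y: int
    package_id: str    # serial of the package the die ended up in
    test_bin: int      # final-test bin code

# After dicing, the (lot, wafer, x, y) identity travels with the package
# serial so a field return can be traced back to its spot on the wafer.
genealogy = {
    "PKG-000123": DieRecord("LOT-A7", "W04", die_x=12, die_y=33,
                            package_id="PKG-000123", test_bin=1),
}

def trace(package_id: str) -> DieRecord:
    """Look up the originating wafer location for a returned unit."""
    return genealogy[package_id]

print(trace("PKG-000123"))
```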

But communicating that out to the rest of the company, or the supply chain, is not a solved problem. It’s better than it was, but it’s still not perfect.

“Everyone wants data out of manufacturing, but it still tends to be a hodgepodge of communication protocols,” said Tom Salmon, vice president of collaborative technology platforms at SEMI. “This isn’t just about machine-to-machine communication. It’s also sensor communication protocols. There are more and more sensors, and everyone wants sensor fusion, but the communications protocols off the sensors need to converge. SEMI is looking at how to work on those.”
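
One common way of coping with that hodgepodge is to translate each vendor's payload into a single canonical record before any analysis or sensor fusion. The sketch below assumes two hypothetical vendor formats and a shared schema; it does not implement any SEMI standard.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorReading:
    """Common schema that downstream analytics consume, regardless of the
    protocol the sensor spoke. Field names are illustrative."""
    tool_id: str
    sensor: str
    value: float
    unit: str
    timestamp: datetime

def from_vendor_a(msg: dict) -> SensorReading:
    # Hypothetical vendor A payload: flat keys with an epoch-seconds timestamp.
    return SensorReading(msg["tool"], msg["chan"], float(msg["val"]), msg["u"],
                         datetime.fromtimestamp(msg["ts"], tz=timezone.utc))

def from_vendor_b(msg: dict) -> SensorReading:
    # Hypothetical vendor B payload: nested measurement, ISO-8601 timestamp.
    m = msg["measurement"]
    return SensorReading(msg["toolId"], m["name"], float(m["reading"]), m["units"],
                         datetime.fromisoformat(msg["time"]))

readings = [
    from_vendor_a({"tool": "ETCH-01", "chan": "chamber_humidity",
                   "val": 0.42, "u": "%RH", "ts": 1_700_000_000}),
    from_vendor_b({"toolId": "CMP-07", "time": "2023-11-14T22:13:20+00:00",
                   "measurement": {"name": "slurry_flow", "reading": 180.0,
                                   "units": "ml/min"}}),
]
for r in readings:
    print(r)
```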

Devil in the details

There are new issues cropping up, as well, around sensors. In chip manufacturing, it’s becoming critical at each new node to sense and address factors that were considered minor in the past. This can include everything from nanometer-sized particles left over from etching or cleaning to a slight increase in humidity in a chamber or on a wafer. Adding more sensors and intelligence to detect any fluctuations and properly characterize them is critical to yield. This is easier said than done, however.
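
A deliberately simple illustration of detecting such a fluctuation is a rolling z-score over a single chamber-humidity trace, sketched below on synthetic data. Real systems would use richer models across many correlated sensors; the window, threshold, and injected disturbance are assumptions.

```python
import numpy as np

def rolling_zscore_alerts(values, window=50, threshold=4.0):
    """Flag samples that deviate sharply from the recent baseline."""
    values = np.asarray(values, dtype=float)
    alerts = []
    for i in range(window, len(values)):
        baseline = values[i - window:i]
        mu, sigma = baseline.mean(), baseline.std()
        if sigma > 0 and abs(values[i] - mu) / sigma > threshold:
            alerts.append(i)
    return alerts

# Synthetic chamber-humidity trace (%RH) with a small step disturbance,
# e.g. a seal issue raising humidity partway through the run.
rng = np.random.default_rng(1)
trace = 40 + 0.2 * rng.standard_normal(500)
trace[300:] += 1.5

print(rolling_zscore_alerts(trace))   # indices where the excursion is flagged
```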

“Photoresists in the mask shop can absorb water very easily due to their hygroscopic nature,” said Vidya Vijay, product manager at CyberOptics. “They can absorb anything that comes in contact with them. In the wafer fabrication area, there are spinners that spray on the wafer surface, resulting in condensation of vapor across the entire wafer. In photolithography, water can be absorbed into the photoresists. This causes problems because the patterns you want on the wafer don’t stick to them. And if there is water in the vacuum equipment, such as a cryopump, it can become very slow.”

Any liquid can change the characterization, notably from the formation of time-dependent haze on a reticle. “The entire process can become more random and less predictable,” Vijay said. “If you have water vapor on the wafer, that can cause many failures in the ICs. Also, when the wafers are stored in polymer cases, humidity can cause a haze effect. That can cause light scattering, which increases over time. This is common in reticles.”


Fig. 1: The impact of humidity at advanced nodes. Source: CyberOptics

This has been an issue for some time, but as more process steps are added, the complexity of the problem rises, too. For example, there can be blind spots in the tools because certain areas are not accessible. Where homegrown fixes exist, they are sometimes incomplete, such as monitoring humidity inside a FOUP, where conditions can create higher humidity on one part of a wafer than another.

Humidity is just one factor. Vijay said it’s also necessary to measure vibration across three axes, as well as inclination across two axes.
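
A single sample covering those quantities might be represented as in the sketch below; the field names, units, and values are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class EnvironmentSample:
    """One sample from an in-chamber environmental sensor.

    Mirrors the quantities mentioned above: humidity, vibration on three
    axes, inclination on two. Names and units are assumptions.
    """
    relative_humidity: float                   # %RH
    vibration_g: tuple[float, float, float]    # x, y, z acceleration in g
    inclination_deg: tuple[float, float]       # pitch, roll in degrees

    def vibration_magnitude(self) -> float:
        x, y, z = self.vibration_g
        return (x * x + y * y + z * z) ** 0.5

sample = EnvironmentSample(relative_humidity=41.2,
                           vibration_g=(0.002, 0.001, 0.004),
                           inclination_deg=(0.05, -0.02))
print(f"{sample.vibration_magnitude():.4f} g")
```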

Conclusion

For companies that have mastered smart manufacturing, the benefits are measurable and impressive. But there are still a number of issues to solve, including a continued mistrust of who is allowed to view data and in what form, and how that data needs to be secured even among trusted partners.

“One of the things we’ve seen with the pandemic is people are willing to share data about remote diagnostics, monitoring, and training,” said SEMI’s Salmon. “But when it comes to sharing other data, they’re far more cautious.”
