When incorporating AI-based systems, enterprises must first determine the real purpose of adopting them and should not use them merely as tools for data presentation; otherwise, the systems may become a "disaster" for backend data users, according to Kevin Tu, senior director at Macronix International.
Tu made the remarks while sharing Macronix's experiences in incorporating AI systems into its memory fabs at a seminar held alongside the just-concluded TPCA Show in Taipei. More than 20 years ago, the company was already using its Super Nova production management system, developed in-house based on an AI concept, to enhance process control and plant management.
Data visualization is just the first step of any AI-enabled system; what really counts is data utilization, Tu said, adding that the system must possess decision-making capability to help engineers make smart decisions. If the system is used for data presentation only, it will remain a traditional IT system, unable to demonstrate the real value of an AI system.
Tu reasoned that data generated by a fab is now measured in terabytes, and it would be a "disaster" for engineers who need to sort out the critical parts from such big data if the AI system lacks decision-making capability.
After data is visualized, Tu continued, enterprises must narrow its scope by classifying the data according to importance, so that engineers can decide which data to examine before locating what is useful.
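The classify-then-narrow workflow Tu describes can be sketched in a few lines of code. This is purely an illustrative example, not Macronix's actual system: the reading format, importance levels, and thresholds are all hypothetical placeholders.

```python
# Illustrative sketch (hypothetical, not Macronix's system): tag fab
# sensor readings with an importance level so engineers only review
# the critical slice of a large data set.

def classify_reading(reading, spec_limit=1.0, warn_ratio=0.8):
    """Assign an importance level to a process reading.

    spec_limit and warn_ratio are hypothetical placeholder thresholds.
    """
    deviation = abs(reading["value"] - reading["target"])
    if deviation > spec_limit:
        return "critical"   # out of spec: needs engineer attention
    if deviation > warn_ratio * spec_limit:
        return "watch"      # drifting toward the spec limit
    return "normal"         # safe to skip during triage

def narrow_scope(readings, keep=("critical", "watch")):
    """Filter a large stream down to the readings worth inspecting."""
    return [r for r in readings if classify_reading(r) in keep]

readings = [
    {"tool": "etch-01", "value": 10.3, "target": 10.0},  # deviation 0.3
    {"tool": "etch-02", "value": 11.5, "target": 10.0},  # deviation 1.5
    {"tool": "cvd-07",  "value": 10.9, "target": 10.0},  # deviation 0.9
]
print(narrow_scope(readings))  # keeps only etch-02 and cvd-07
```

The point of the sketch is the ordering: classification happens automatically before any human looks at the data, so engineers start from a pre-ranked short list instead of the full stream.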
At the seminar, Macronix president JK Chen also noted that his firm's 12-inch fab with a monthly capacity of 20,000 wafers can collect over two billion pieces of data a day, and that the volume will grow to over 10 billion pieces for a 12-inch fab with a monthly capacity of 100,000-120,000 wafers. As such, he said, converting big data into useful data will pose the biggest challenge to fab managers.