University team develops explainable AI module

Bryan Chuang, Taipei; Adam Hwang, DIGITIMES Asia

Taiwanese researchers have developed an explainable AI module that can explain the reasons behind facial recognition results.

The MOST Joint Research Center for AI Technology and All Vista Health Care, an AI research center established at National Taiwan University (NTU) and sponsored by the Ministry of Science and Technology (MOST), on May 11 unveiled its explainable AI module, xCos.

In current AI-based recognition, the input data and the results are known, but the criteria and judgment process leading to those results are not, said NTU professor Winston Hsu, leader of the MOST-sponsored xCos project.

xCos can explain the reasons behind the results of AI facial recognition, and can help developers of AI-based recognition technologies inspect a system's recognition mechanism to judge whether its decisions are reasonable and thereby improve the system, Hsu noted.
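The article does not detail the mechanism, but the general idea behind an explainable cosine metric of this kind is to decompose the overall similarity between two face images into patch-wise local cosine similarities, weighted by an attention map that highlights which facial regions drove the decision. The following is a minimal NumPy sketch of that idea, not the NTU team's actual implementation; the function name, the uniform attention map, and the toy feature grids are all illustrative assumptions.

import numpy as np

def xcos_sketch(feat_a, feat_b, attention):
    """Illustrative patch-wise explainable similarity (not the real xCos code).

    feat_a, feat_b: (H, W, C) spatial feature maps extracted from two
                    aligned face images by a CNN backbone.
    attention:      (H, W) weights for how much each facial region should
                    contribute (learned in a real model; supplied here).
    Returns the scalar similarity plus the per-patch map that "explains" it.
    """
    # Local cosine similarity between corresponding patches.
    num = (feat_a * feat_b).sum(axis=-1)
    denom = (np.linalg.norm(feat_a, axis=-1) *
             np.linalg.norm(feat_b, axis=-1) + 1e-8)
    local_cos = num / denom                  # (H, W) similarity map

    w = attention / attention.sum()          # normalize the weights
    score = (w * local_cos).sum()            # weighted overall similarity
    return score, local_cos                  # map shows which regions matched

# Toy usage: random 7x7x512 feature grids stand in for CNN outputs.
rng = np.random.default_rng(0)
fa = rng.normal(size=(7, 7, 512))
fb = rng.normal(size=(7, 7, 512))
att = np.ones((7, 7))                        # uniform attention for the demo
score, cos_map = xcos_sketch(fa, fb, att)
peak = np.unravel_index(cos_map.argmax(), cos_map.shape)
print(f"similarity={score:.3f}; most-matching patch at {peak}")

In a sketch like this, the per-patch map is what a developer would inspect: high local similarity on the eyes but low similarity on an occluded mouth, for instance, makes the basis of a match or mismatch visible rather than hidden in a single score.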

xCos can generally be paired with AI-based facial recognition systems, Hsu said, adding that the AI research center is also promoting the application of xCos to AI-based decision making beyond facial recognition. For example, when an AI model predicts that a power plant will increase power generation in the next hour, xCos can explain that this is because weather conditions have changed or festivities are expected.

Hsu's team won first place at the Disguised Faces in the Wild competition held at the 2018 Conference on Computer Vision and Pattern Recognition (CVPR) in Salt Lake City, Utah, with overall accuracy of over 90%.

MOST minister Chen Liang-gee in disguise testing AI-based facial recognition
Photo: MOST