Asst Prof Fan Xiuyi
Assistant Professor of Digital Health, Lee Kong Chian School of Medicine
Dr. Fan Xiuyi is an Assistant Professor at the Lee Kong Chian School of Medicine, with a joint appointment at the School of Computer Science and Engineering. He received his BSc and MSc degrees from the University of Utah in the United States and his PhD from Imperial College London in the United Kingdom. Before joining NTU, Dr. Fan worked at the University of Utah, the University of Sydney, Imperial College London, and Swansea University. His current research focuses on Explainable AI (XAI).
Dr. Fan has a strong track record in developing XAI theories and applications. Specifically, he has developed explainable decision-making theories and applications, and has pioneered work on formal semantics for explanations. He has explored ways to introduce argumentation-based explanations into machine learning and developed the first argumentation-based explainable planning framework. He has also studied probabilistic satisfiability to develop a probabilistic logic-based explainable machine learning method and its applications. Recently, he has been working on comparing and formally evaluating XAI methods, building a generic XAI software toolkit for medical applications, and using XAI to understand situations in which predictions cannot be trusted.
Dr. Fan has published refereed papers at AI conferences and in journals including AAMAS, IJCAI, AAAI, ICRA, ECAI, and AIJ. He was the recipient of the Welsh Crucible Leadership Programme Award, the Best Paper Award at IEEE ICSGAH, the first prize in the Health Innovation Technology Challenge, and the Rector’s Award Scholarship at Imperial College London. While working in the UK, he led the projects “Using Explainable AI to Understand the Impacts of Non-pharmaceutical Control Measures on COVID-19 Transmission for Evidence-based Policy” and “Understanding the Impact of COVID-19 on the International Economy via Currency Markets using Explainable AI”, both funded by the Welsh Government.
Explainable AI, Digital Health
- Empowering Digital Health through Predictive Analysis on Electronic Health Records with Trustworthy AI
- AI-Powered Transformation of Diabetic Macular Edema (DME) Management: Personalized Predictive Analytics for Enhanced Care