Academic Profile

Dr. Mao obtained his BEng, MEng, and PhD from Jinan University, Northeastern University, and the University of Sheffield in 1989, 1992, and 1998, respectively. Since obtaining his PhD, he has worked at the School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore, where he is an Associate Professor. Dr. Mao has over 20 years of experience in artificial intelligence, machine learning, image processing, natural language processing, and information fusion, and has published about 100 scientific papers in these fields. He was named among the Top 2% Scientists Worldwide 2021 in a Stanford University study, in which his AI research was ranked in the top 0.5%. As a strong advocate of translational research, he has successfully developed and delivered several intelligent systems and tools to government agencies and industry. Beyond research and development, Dr. Mao also conducts consultancy work and training courses in the field.
Assoc Prof Mao Kezhi
Associate Professor, School of Electrical & Electronic Engineering

Machine Learning, Deep Learning for Big Data, Natural Language Processing, Image/Video Processing, Information Fusion, and Cognitive Science
  • Project CROSSON (IMA)
  • ACT: An Attentive Convolutional Transformer for Efficient Text Classification. Proceedings of the 35th AAAI Conference on Artificial Intelligence (AAAI-21), pp. 13261-13269, 2021.
  • Partial Video Domain Adaptation with Partial Adversarial Temporal Attentive Network. Proceedings of the International Conference on Computer Vision (ICCV), 2021.
  • Effective Action Recognition with Embedded Key Point Shifts. Pattern Recognition, Vol. 120: 108172, 2021.
  • Bag-of-Concepts Representation for Document Classification Based on Automatic Knowledge Acquisition from Probabilistic Knowledge Base. Knowledge-Based Systems, Vol. 193: 105436, 2020.
  • Improving Relation Extraction with Knowledge-Attention. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pp. 229-239, 2019.