I am a tenure-track assistant professor of Computer Science and UMIACS at the University of Maryland, College Park. My research interests are in machine learning, optimization, and natural language processing. I have published ~80 papers at NeurIPS, ICML, ICLR, AISTATS, ACL, EMNLP, NAACL, COLING, CVPR, KDD, ICDM, AAAI, IJCAI, ISIT, Machine Learning (Springer), IEEE TIP/TNNLS/TKDE, etc. I am the recipient of the Best Student Paper Award at ICDM 2013 and the 2020 IEEE TCSC Most Influential Paper Award.

My recent work studies how, why, and when to translate human learning strategies (e.g., curriculum, retention, sub-tasking, curiosity, exemplar learning, collaborative learning) into improvements for machine learning in the wild (e.g., with unlabeled, biased, noisy, redundant, or distributed data, and extrapolation to unseen tasks/environments). Our work builds on empirical and theoretical analysis of the learning dynamics of neural networks, together with tools from discrete and continuous optimization. Our goal is to develop efficient, versatile, trustworthy, and environmentally friendly hybrid intelligence based on coevolution between humans and machines. A list of my research topics can be found below.

I was a visiting research scientist at Google between 2021 and 2022. Before that, I was a Ph.D. student in Computer Science at the University of Washington and a member of the MELODI lab led by Prof. Jeff A. Bilmes. I was a research assistant at the University of Technology Sydney (UTS) and Nanyang Technological University (NTU), supervised by Prof. Dacheng Tao (University of Sydney). I was a research intern at Yahoo! Labs, supervised by Hua Ouyang (Apple) and Yi Chang (Jilin University), and a research intern at Microsoft Research, supervised by Lin Xiao (Facebook AI Research). I also work closely with several members and students of the Australian AI Institute.


My Research