Biography

My research interests are in machine learning, optimization, and natural language processing. I have published ~140 papers at ML (NeurIPS, ICML, ICLR), NLP (ACL, EMNLP, NAACL), CV (CVPR, ICCV, ECCV), DM (KDD, ICDM), and AI (AAAI, IJCAI) conferences, and in journals such as Machine Learning (Springer), IEEE TPAMI/TIP/TNNLS/TKDE, etc. Our recent work mainly focuses on:

  • Human-AI Hybrid Intelligence: Curriculum Learning, Training Dynamics, Unsupervised Exploration, Human-AI Alignment & Teaming, Theory-of-Mind, Co-Education, etc.;
  • Training Generative AI for better controllability, efficiency, and reasoning skills;
  • Synthetic data/tasks, self-evolving creative AI, and auto-benchmarking;
  • Neuro-symbolic, Physics-Informed World Models & Embodied Multi-modal Agents;
  • Mixture-of-Experts, Multi-Agent, and Collaborative Learning;
  • Memorization and Generalization mechanisms in Foundation Models.

Our studies build upon recent LLMs, unified multi-modal models, RL, and agentic workflows to address practical challenges in education, design, healthcare, visualization, embodied intelligence, autonomous driving, etc. Our goal is to develop efficient, versatile, trustworthy, and environmentally friendly hybrid intelligence based on the coevolution of humans and machines. Our code/data/models can be found at Tianyi Lab’s GitHub and HF.

I was a visiting research scientist at Google from 2021 to 2022, hosted by Prof. Boqing Gong and Prof. Ming-Hsuan Yang. Before that, I received my Ph.D. (thesis) in Computer Science from the University of Washington, where I was a member of the MELODI lab led by Prof. Jeff A. Bilmes. Earlier, I worked with Prof. Dacheng Tao as a research assistant at the University of Technology Sydney (UTS) and Nanyang Technological University. I was a research intern at Yahoo! Labs, mentored by Dr. Hua Ouyang (Apple) and Prof. Yi Chang (Jilin University), and a research intern at Microsoft Research, mentored by Dr. Lin Xiao (Meta AI).

News

Research Topics