To discover strong materials, materials scientists must sequentially select processing parameters and conduct time-consuming experiments to observe their performance. To train a deep neural network, computer scientists must run computationally expensive hyperparameter-tuning experiments on validation sets. In both cases, scientists are solving black-box global optimization problems. However, the performance of unselected parameters cannot be observed, and each experiment can be extremely costly. These two challenges call for efficient and robust global optimization algorithms. In this talk, I will present my work on global optimization with parametric function approximation and show how to relax the function realizability assumption to the misspecified setting.
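To make the problem setting concrete, here is a minimal sketch of sequential black-box optimization with a parametric surrogate. The objective function, candidate grid, and quadratic surrogate are all illustrative assumptions, not the method from the talk: the optimizer can only query the black box point-by-point, fits a parametric model to the observations, and uses it to choose the next query.

```python
import numpy as np

# Hypothetical black-box objective, standing in for an expensive
# experiment; its formula is unknown to the optimizer.
def expensive_experiment(x):
    return -(x - 0.3) ** 2 + 1.0

candidates = np.linspace(0.0, 1.0, 101)  # processing parameters to choose from
observed_x, observed_y = [], []

# Seed with the two endpoints of the candidate range.
for x0 in (candidates[0], candidates[-1]):
    observed_x.append(x0)
    observed_y.append(expensive_experiment(x0))

# Sequentially query the point that maximizes a quadratic (parametric)
# surrogate fit to the observations by least squares.
for _ in range(8):
    X = np.vander(np.array(observed_x), 3)  # features [x^2, x, 1]
    coef, *_ = np.linalg.lstsq(X, np.array(observed_y), rcond=None)
    preds = np.vander(candidates, 3) @ coef
    preds[np.isin(candidates, observed_x)] = -np.inf  # skip queried points
    x_next = candidates[int(np.argmax(preds))]
    observed_x.append(x_next)
    observed_y.append(expensive_experiment(x_next))

best_x = observed_x[int(np.argmax(observed_y))]
print(round(best_x, 2))  # close to the true optimum at 0.3
```

Because the surrogate here happens to contain the true objective (realizability holds), the greedy loop locates the optimum quickly; when the model class is misspecified, the surrogate's predictions are systematically wrong, which is exactly the failure mode the relaxed analysis in the talk addresses.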
Chong Liu is a DSI Postdoctoral Scholar at the University of Chicago, and he will join the University at Albany - State University of New York as an Assistant Professor of Computer Science in Fall 2024. His research interests are Machine Learning (Bayesian optimization, bandits, and active learning) and AI for Science (experimental design, materials science, and drug discovery). He is an editorial board reviewer for JMLR and serves on the program committees of the ICML, NeurIPS, ICLR, AISTATS, AAAI, KDD, and SDM conferences. He is also the lead organizer of the NeurIPS 2023 Workshop on New Frontiers of AI for Drug Discovery and Development.