I am a postdoctoral researcher working with Emily B. Fox at Stanford University. I work on probabilistic modeling and inference for machine learning. My research aims to bridge the gap between data modeling (e.g., generative models and Bayesian methods) and algorithmic modeling (e.g., neural networks and spectral methods) by addressing challenging inferential questions at their interface, such as variational inference and gradient estimation, sampling and optimization, score-based learning, and predictive uncertainty estimation.
Before moving to Stanford, I spent two wonderful years (one of them remote) with the Machine Learning and Statistics group at Microsoft Research New England. I obtained my PhD in Computer Science (2015-2020) from Tsinghua University, advised by Jun Zhu. During my PhD, I spent a summer at DeepMind as a research scientist intern, visited the Vector Institute, and spent another summer interning at RIKEN-AIP in Tokyo. I received my B.E. in Computer Science from Tsinghua University.
- Our work on gradient estimation for discrete distributions won the NeurIPS 2022 Outstanding Paper Award!
- I am an area chair for AISTATS 2023.
- I was named a top reviewer for NeurIPS 2022.
Probabilistic Inference and Gradient Estimation
A Finite-Particle Convergence Rate for Stein Variational Gradient Descent
Jiaxin Shi, Lester Mackey.
Gradient Estimation with Discrete Stein Operators
Jiaxin Shi, Yuhao Zhou, Jessica Hwang, Michalis K. Titsias, Lester Mackey.
NeurIPS 2022 Outstanding Paper Award.
Double Control Variates for Gradient Estimation in Discrete Latent Variable Models
Michalis K. Titsias, Jiaxin Shi.
Sampling with Mirrored Stein Operators
Jiaxin Shi, Chang Liu, Lester Mackey.
Spotlight Presentation (top 5.1%).
Understanding Deep Learning, Representation Learning
Neural Eigenfunctions Are Structured Representation Learners
Zhijie Deng*, Jiaxin Shi*, Hao Zhang, Peng Cui, Cewu Lu, Jun Zhu.
NeuralEF: Deconstructing Kernels by Deep Neural Networks
Zhijie Deng, Jiaxin Shi, Jun Zhu.
Neural Networks as Inter-domain Inducing Points
Shengyang Sun*, Jiaxin Shi*, Roger Grosse.
Nonparametric Score Estimators
Yuhao Zhou, Jiaxin Shi, Jun Zhu.
Sliced Score Matching: A Scalable Approach to Density and Score Estimation
Yang Song*, Sahaj Garg*, Jiaxin Shi, Stefano Ermon.
Oral Presentation (top 8.7%).
A Spectral Approach to Gradient Estimation for Implicit Distributions
Jiaxin Shi, Shengyang Sun, Jun Zhu.
Predictive Uncertainty Estimation
Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition
Shengyang Sun, Jiaxin Shi, Andrew Gordon Wilson, Roger Grosse.
Sparse Orthogonal Variational Inference for Gaussian Processes
Jiaxin Shi, Michalis K. Titsias, Andriy Mnih.
Best Student Paper Runner-Up at AABI, 2019.
Functional Variational Bayesian Neural Networks
Shengyang Sun*, Guodong Zhang*, Jiaxin Shi*, Roger Grosse.