I am a postdoctoral researcher at Stanford University, working with Emily B. Fox on probabilistic modeling and inference for machine learning. My research aims to bridge the gap between data modeling (e.g., generative models and Bayesian methods) and algorithmic modeling (e.g., neural networks and spectral methods) by addressing challenging inferential questions at their interface, such as variational inference and gradient estimation, sampling and optimization, score-based learning, and predictive uncertainty estimation.

Before moving to Stanford, I spent two wonderful years (one of them remote) with the Machine Learning and Statistics group at Microsoft Research New England. I obtained my PhD in Computer Science (2015–2020) from Tsinghua University, advised by Jun Zhu. During my graduate years, I spent a summer at DeepMind as a research scientist intern and visited the Vector Institute; I also spent a summer interning at RIKEN-AIP in Tokyo. I received my B.E. in Computer Science from Tsinghua University.

[Github] [Twitter]

Selected Publications

Probabilistic Inference and Gradient Estimation

A Finite-Particle Convergence Rate for Stein Variational Gradient Descent

Jiaxin Shi, Lester Mackey.

Preprint, 2022. [pdf] [abs]

Gradient Estimation with Discrete Stein Operators

Jiaxin Shi, Yuhao Zhou, Jessica Hwang, Michalis K. Titsias, Lester Mackey.

NeurIPS, 2022. [pdf] [abs] [code]

NeurIPS 2022 Outstanding Paper Award.

Double Control Variates for Gradient Estimation in Discrete Latent Variable Models

Michalis K. Titsias, Jiaxin Shi.

AISTATS, 2022. [pdf] [abs] [code]

Sampling with Mirrored Stein Operators

Jiaxin Shi, Chang Liu, Lester Mackey.

ICLR, 2022. [pdf] [abs] [code] [slides]

Spotlight Presentation (top 5.1%).

Understanding Deep Learning and Representation Learning

Neural Eigenfunctions Are Structured Representation Learners

Zhijie Deng*, Jiaxin Shi*, Hao Zhang, Peng Cui, Cewu Lu, Jun Zhu.

Preprint, 2022. [pdf] [abs]

NeuralEF: Deconstructing Kernels by Deep Neural Networks

Zhijie Deng, Jiaxin Shi, Jun Zhu.

ICML, 2022. [pdf] [abs] [code]

Neural Networks as Inter-domain Inducing Points

Shengyang Sun*, Jiaxin Shi*, Roger Grosse.

AABI Symposium, 2020. [pdf] [slides] [video]

Score-based Learning

Nonparametric Score Estimators

Yuhao Zhou, Jiaxin Shi, Jun Zhu.

ICML, 2020. [pdf] [abs] [code] [slides]

Sliced Score Matching: A Scalable Approach to Density and Score Estimation

Yang Song*, Sahaj Garg*, Jiaxin Shi, Stefano Ermon.

UAI, 2019. [pdf] [abs] [code] [video] [blog]

Oral Presentation (top 8.7%).

A Spectral Approach to Gradient Estimation for Implicit Distributions

Jiaxin Shi, Shengyang Sun, Jun Zhu.

ICML, 2018. [pdf] [abs] [code] [slides]

Predictive Uncertainty Estimation

Scalable Variational Gaussian Processes via Harmonic Kernel Decomposition

Shengyang Sun, Jiaxin Shi, Andrew Gordon Wilson, Roger Grosse.

ICML, 2021. [pdf] [abs] [code]

Sparse Orthogonal Variational Inference for Gaussian Processes

Jiaxin Shi, Michalis K. Titsias, Andriy Mnih.

AISTATS, 2020. [pdf] [abs] [code] [slides]

Best Student Paper Runner-Up at AABI, 2019.

Functional Variational Bayesian Neural Networks

Shengyang Sun*, Guodong Zhang*, Jiaxin Shi*, Roger Grosse.

ICLR, 2019. [pdf] [abs] [code] [video]

Software

During my PhD studies, I led the development of ZhuSuan [github] [doc] [arxiv], an open-source differentiable probabilistic programming library built on TensorFlow; a minimal usage sketch follows.
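
To give a flavor of the library, here is a minimal sketch of a VAE-style generative model written against ZhuSuan's TF1-era API, in the style of its documented examples. The `zs.reuse`, `zs.BayesianNet`, `zs.Normal`, and `zs.Bernoulli` names are ZhuSuan's; the layer sizes and variable names are illustrative assumptions, and exact signatures may differ across library versions:

```python
import tensorflow as tf
import zhusuan as zs

@zs.reuse('model')  # share TF variables across calls to this model function
def vae(observed, n, x_dim, z_dim):
    # Generative model p(z) p(x|z): an isotropic Gaussian prior on the
    # latent code z and a Bernoulli likelihood on binarized pixels x.
    # Note: layer sizes here are illustrative, not from the original text.
    with zs.BayesianNet(observed=observed) as model:
        z_mean = tf.zeros([n, z_dim])
        z = zs.Normal('z', z_mean, std=1., group_ndims=1)
        h = tf.layers.dense(z, 500, activation=tf.nn.relu)
        x_logits = tf.layers.dense(h, x_dim)
        x = zs.Bernoulli('x', x_logits, group_ndims=1)
    return model
```

Stochastic nodes declared inside the net behave like ordinary tensors, and the returned `BayesianNet` exposes their log-probabilities, which ZhuSuan's variational objectives (e.g., its ELBO constructors) consume for inference; see the linked docs for complete, version-specific examples.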