Building probabilistic ML methods — variational inference, Gaussian processes, differentiable programming — and deploying them on some of the largest datasets in astrophysics.
I develop machine learning and probabilistic inference methods and apply them to scientific problems in astrophysics. As a KIPAC Fellow at Stanford University, I build scalable Bayesian algorithms — including variational inference, Gaussian processes, normalizing flows, and differentiable programming frameworks — that turn noisy, incomplete observational data into high-fidelity reconstructions of physical systems.
My primary application domain is 3D mapping of the interstellar medium, where I use ML to push reconstructions to unprecedented scales in size and resolution and to combine multiple tracers into a comprehensive picture of Galactic structure. This work sheds light on the mechanisms of star formation and galaxy dynamics on scales accessible only from our unique vantage point within the Milky Way.
I did my PhD and a follow-up postdoc at the Faculty of Physics at LMU Munich and the Max Planck Institute for Astrophysics, where I built the theoretical foundations for geometric variational inference and contributed ML-driven applications spanning radio interferometric imaging (the M87* black hole), X-ray and gamma-ray imaging, cosmic-ray air-shower reconstruction, and 3D Galactic dust and gas mapping.
Building ML methods for scientific inference — from variational algorithms and scalable GPs to real-world deployments on billion-object datasets.
A variational inference algorithm that exploits information geometry to navigate complex, high-dimensional posterior landscapes. Uses coordinate transformations to turn difficult posteriors into near-Gaussian problems, enabling scalable approximate inference.
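The core idea of such a coordinate transformation can be sketched in a few lines. The snippet below is a minimal illustration of the standardization trick — reparameterizing a non-Gaussian variable as a bijection of a standard normal one, so that sampling and optimization happen in coordinates where the distribution is exactly Gaussian. It is not the geometric VI algorithm itself; the log-normal target and the parameter values are hypothetical.

```python
import numpy as np

# Hypothetical non-Gaussian target: a log-normal with these parameters.
mu, sigma = 0.0, 0.5

def push_forward(xi):
    # Map standardized coordinates xi ~ N(0, 1) to the target variable.
    # In transformation-based VI, inference runs in xi-space, where the
    # distribution is a unit Gaussian, and samples are pushed forward.
    return np.exp(mu + sigma * xi)

rng = np.random.default_rng(42)
xi = rng.standard_normal(200_000)  # trivial sampling in standardized space
x = push_forward(xi)               # samples from the log-normal target

# Analytic mean of the log-normal, for comparison with the sample mean.
analytic_mean = np.exp(mu + 0.5 * sigma**2)
```

The same pattern underlies normalizing flows and reparameterized VI more broadly: all the geometric difficulty of the posterior is absorbed into the transformation, leaving a near-trivial distribution to work with.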
A benchmark for evaluating whether AI agents can replicate full astrophysics research papers — testing faithfulness and correctness across experimental setup, data analysis, and code generation on expert-validated tasks.
Core developer of NIFTy — a JAX-based probabilistic programming framework for Gaussian processes, variational inference, and differentiable forward modeling. Powers 20+ published scientific reconstructions.
See also: Google Scholar
Scalable Gaussian processes via nearest-neighbor graphs — enabling GP regression on millions of data points with GPU acceleration (CUDA).
JAX-based probabilistic programming framework for Gaussian processes, geometric variational inference, and differentiable forward modeling.
JAX-accelerated Universal Bayesian Imaging Kit — differentiable inference pipelines for multi-domain astrophysical imaging.
Bayesian strong gravitational lensing reconstruction using structured generative models and variational inference.
Non-parametric Bayesian density estimation for Poisson-distributed count data with learned correlation structure.
Probabilistic imaging pipeline for dynamic radio interferometry — joint inference of time-varying structure from sparse Fourier-plane data.