Hey, I’m Kai

I’m a research scientist at the MIT-IBM Watson AI Lab working on probabilistic methods at scale. My research interests lie primarily in learning, inference, and alignment methods for deep probabilistic models and their (under-explored) real-world applications. I’m proud to be an original member of the Turing team, which builds the Turing probabilistic programming language in Julia.

I did my Ph.D. with Charles Sutton at the Institute for Adaptive and Neural Computation, University of Edinburgh, where I met the other two Cat Squad members, Akash Srivastava (who is also at MIT-IBM) and Cole Hurwitz. Before that, I worked with Zoubin Ghahramani as a master's student and research assistant at the Cambridge Machine Learning Group, University of Cambridge, where I started working on the Turing project together with Hong Ge.

* denotes equal contributions

Papers

Conference

  • Shivchander Sudalairaj*, Abhishek Bhandwaldar*, Aldo Pareja*, Kai Xu, David D. Cox, Akash Srivastava*. “LAB: Large-Scale Alignment for ChatBots”, 2024. [arXiv, Merlinite-7B, Labradorite-13b]
  • Chi-Jen (Roger) Lo, Mahesh K. Marina, Nishanth Sastry, Kai Xu, Saeed Fadaei and Yong Li. “Shrinking VOD Traffic via Rényi-Entropic Optimal Transport”, Proceedings of the ACM on Measurement and Analysis of Computing Systems (SIGMETRICS), 2024. [ACM]
  • Akash Srivastava, Seungwook Han, Kai Xu, Benjamin Rhodes, Michael U. Gutmann. “Estimating the Density Ratio between Distributions with High Discrepancy using Multinomial Logistic Regression”, Transactions on Machine Learning Research (TMLR), 2023. [OpenReview]
  • Charlotte Loh, Seungwook Han, Shivchander Sudalairaj, Rumen Dangovski, Kai Xu, Florian Wenzel, Marin Soljacic, Akash Srivastava. “Multi-Symmetry Ensembles: Improving Diversity and Generalization via Opposing Symmetries”, International Conference on Machine Learning (ICML), 2023. [arXiv]
  • Kai Xu, Georgi Ganev, Emile Joubert, Rees Davison, Olivier Van Acker, Luke Robinson. “Synthetic Data Generation of Many-to-Many Datasets via Random Graph Generation”. International Conference on Learning Representations (ICLR), 2023. [OpenReview]
    • Work done at Hazy, a synthetic-data-generation startup where I spent nine wonderful months after my Ph.D.
  • Kai Xu, Akash Srivastava, Dan Gutfreund, Felix Sosa, Tomer Ullman, Joshua B. Tenenbaum, Charles Sutton. “A Bayesian-Symbolic Approach to Learning and Reasoning for Intuitive Physics.”, Neural Information Processing Systems (NeurIPS), 2021. [OpenReview, website, code]
  • Cole L. Hurwitz, Akash Srivastava, Kai Xu, Justin Jude, Matt Perich, Lee E. Miller, Matthias H. Hennig. “Targeted Neural Dynamical Modeling.”, Neural Information Processing Systems (NeurIPS), 2021. [OpenReview]
  • Kai Xu*, Tor Erlend Fjelde*, Charles Sutton, Hong Ge. “Couplings for Multinomial Hamiltonian Monte Carlo.”, International Conference on Artificial Intelligence and Statistics (AISTATS), 2021. Oral (top 10% of accepted papers) [abs, pdf, arXiv, code]
  • Benjamin Rhodes, Kai Xu, Michael U. Gutmann. “Telescoping Density-Ratio Estimation.”, Neural Information Processing Systems (NeurIPS), 2020. Spotlight (top 20% of accepted papers) [abs, pdf, arXiv]
  • Akash Srivastava*, Kai Xu*, Michael U. Gutmann and Charles Sutton. “Generative Ratio Matching Networks.”, International Conference on Learning Representations (ICLR), 2020. [pdf, OpenReview, code]
  • Cole L. Hurwitz, Kai Xu, Akash Srivastava, Alessio Paolo Buccino and Matthias Hennig. “Scalable Spike Source Localization in Extracellular Recordings using Amortized Variational Inference.”, Neural Information Processing Systems (NeurIPS), 2019. [abs, pdf, arXiv]
  • Kai Xu, Akash Srivastava and Charles Sutton. “Variational Russian Roulette for Deep Bayesian Nonparametrics.”, International Conference on Machine Learning (ICML), 2019. [abs, pdf, suppl, code]
  • Hong Ge, Kai Xu, and Zoubin Ghahramani. “Turing: A Language for Flexible Probabilistic Inference.”, International Conference on Artificial Intelligence and Statistics (AISTATS), 2018. [abs, pdf, code, website]

Deep Generative Models for Mobile Networking (collaboration with Mahesh K. Marina’s group)

  • Chuanhao Sun, Kai Xu, Mahesh K. Marina, Howard Benn. “GenDT: Mobile Network Drive Testing Made Efficient with Generative Modeling”. International Conference on emerging Networking EXperiments and Technologies (ACM CoNEXT; 19% acceptance rate), 2022.
  • Chuanhao Sun, Kai Xu, Marco Fiore, Mahesh K. Marina, Yue Wang, Cezary Ziemlicki. “AppShot: A Conditional Deep Generative Model for Synthesizing Service-Level Mobile Traffic Snapshots at City Scale”. IEEE Transactions on Network and Service Management (IEEE TNSM), 2022. [IEEE]
  • Kai Xu*, Rajkarn Singh*, Hakan Bilen, Marco Fiore, Mahesh K. Marina, Yue Wang. “CartaGenie: Context-Driven Synthesis of City-Scale Mobile Network Traffic Snapshots”, International Conference on Pervasive Computing and Communications (IEEE PerCom; acceptance rate < 20%), 2022. [IEEE, dataset]
  • Kai Xu, Rajkarn Singh, Marco Fiore, Mahesh K. Marina, Hakan Bilen, Muhammad Usama, Howard Benn, Cezary Ziemlicki. “SpectraGAN: Spectrum based Generation of City Scale Spatiotemporal Mobile Network Traffic Data”, International Conference on emerging Networking EXperiments and Technologies (ACM CoNEXT; 17% acceptance rate), 2021. [ACM, dataset]

Workshop

  • Kai Xu, Hong Ge, Will Tebbutt, Mohamed Tarek, Martin Trapp, Zoubin Ghahramani. “AdvancedHMC.jl: A robust, modular and efficient implementation of advanced HMC algorithms.”, Symposium on Advances in Approximate Bayesian Inference (AABI), 2019. [abs, pdf, OpenReview]
  • Tor Erlend Fjelde, Kai Xu, Mohamed Tarek, Sharan Yalburgi, Hong Ge. “Bijectors.jl: Flexible transformations for probability distributions.”, Symposium on Advances in Approximate Bayesian Inference (AABI), 2019. [abs, pdf, OpenReview]

Preprint

  • Simao Eduardo, Kai Xu, Alfredo Nazabal, Charles Sutton. “Repairing Systematic Outliers by Learning Clean Subspaces in VAEs.”, arXiv preprint arXiv:2207.08050, 2022. [arXiv]
  • Akash Srivastava, Yamini Bansal, Yukun Ding, Cole Hurwitz, Kai Xu, Bernhard Egger, Prasanna Sattigeri, Josh Tenenbaum, David Cox, Dan Gutfreund. “Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modelling.”, arXiv preprint arXiv:2010.13187, 2020. [arXiv]
  • Mohamed Tarek, Kai Xu, Martin Trapp, Hong Ge, Zoubin Ghahramani. “DynamicPPL: Stan-like Speed for Dynamic Probabilistic Models.”, arXiv preprint arXiv:2002.02702, 2020. [arXiv]
  • Kai Xu, Dae Hoon Park, Yi Chang and Charles Sutton. “Interpreting Deep Classifiers by Visual Distillation of Dark Knowledge.”, arXiv preprint arXiv:1803.04042, 2018. [arXiv, pdf, demo, code, website]

Software

  • Turing.jl: Bayesian inference with probabilistic programming. [GitHub, website]
  • AdvancedHMC.jl: Robust, modular and efficient implementation of advanced Hamiltonian Monte Carlo algorithms. [GitHub]
  • DensityRatioEstimation.jl: A Julia package for density ratio estimation. [GitHub]

Professional Services

Journal reviewing

Journal of Machine Learning Research (JMLR)

Conference reviewing

Neural Information Processing Systems (NeurIPS), International Conference on Machine Learning (ICML), International Conference on Learning Representations (ICLR) and International Conference on Artificial Intelligence and Statistics (AISTATS)

Pre-Ph.D / Casual Projects

  • Mobile Robot Control using ROS [poster]
  • DBD Plasma Reactor Power Monitoring System
  • Wireless Brain-Computer Interface for Game Control [video]
  • FlatShare: A cost splitting app for flatmates [code]
  • Gravity Snake [demo, code]: you control the snake via accelerometers; the snake always heads downwards, and you steer it by rotating your phone!

Created via Emacs with Org mode | Styled in Nord | Updated on 22 Mar 2024