Hey, I’m Kai
Scholar | GitHub
I’m a research scientist at the MIT-IBM Watson AI Lab, working on probabilistic methods at scale. My research interests lie primarily in learning, inference, and alignment methods for deep probabilistic models and their (under-explored) real-world applications. I’m proud to be an original member of the TuringLang team that builds the Turing probabilistic programming language in Julia. I’m also part of the open-source LLM project InstructLab, where I led the initial development of the InstructLab CLI.
I did my Ph.D. with Charles Sutton at the Institute for Adaptive and Neural Computation, University of Edinburgh, where I met the other two Cat Squad members, Akash Srivastava (who is also at MIT-IBM) and Cole Hurwitz. Before that, I worked with Zoubin Ghahramani as a master’s student and research assistant at the Cambridge Machine Learning Group, University of Cambridge, where I started working on the Turing project together with Hong Ge.
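If you haven’t come across probabilistic programming before, here is a minimal, illustrative sketch of what a Turing model looks like (a textbook coin-flip example, not taken from any of the papers below):

```julia
using Turing

# Coin-flip model: infer the unknown bias p from observed flips y.
@model function coinflip(y)
    p ~ Beta(1, 1)             # uniform prior on the bias
    for i in eachindex(y)
        y[i] ~ Bernoulli(p)    # each flip is a Bernoulli draw
    end
end

# Posterior sampling with the No-U-Turn Sampler (backed by AdvancedHMC.jl).
chain = sample(coinflip([1, 1, 0, 1]), NUTS(), 1_000)
```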
denotes equal contributions
Papers
Highlights
- Kai Xu, Farid Tajaddodianfar, Ben Allison. “Improving Reward-Conditioned Policies for Multi-Armed Bandits using Normalized Weight Functions”, 2024. [arXiv]
- Kai Xu, Hong Ge. “Practical Hamiltonian Monte Carlo on Riemannian Manifolds via Relativity Theory”, To appear in International Conference on Machine Learning (ICML), 2024.
- Chuanhao Sun, Zhihang Yuan, Kai Xu, Luo Mai, Siddharth N, Shuo Chen, Mahesh K. Marina. “Learning High-Frequency Functions Made Easy with Sinusoidal Positional Encoding”, To appear in International Conference on Machine Learning (ICML), 2024.
- Shivchander Sudalairaj, Abhishek Bhandwaldar, Aldo Pareja, Kai Xu, David D. Cox, Akash Srivastava. “LAB: Large-Scale Alignment for ChatBots”, 2024. [arXiv, Merlinite-7B, Labradorite-13b, InstructLab]
Conference
- Chi-Jen (Roger) Lo, Mahesh K. Marina, Nishanth Sastry, Kai Xu, Saeed Fadaei and Yong Li. “Shrinking VOD Traffic via Rényi-Entropic Optimal Transport”, Proceedings of the ACM on Measurement and Analysis of Computing Systems (SIGMETRICS), 2024. [ACM]
- Akash Srivastava, Seungwook Han, Kai Xu, Benjamin Rhodes, Michael U. Gutmann. “Estimating the Density Ratio between Distributions with High Discrepancy using Multinomial Logistic Regression”, Transactions on Machine Learning Research (TMLR), 2023. [OpenReview]
- Charlotte Loh, Seungwook Han, Shivchander Sudalairaj, Rumen Dangovski, Kai Xu, Florian Wenzel, Marin Soljacic, Akash Srivastava. “Multi-Symmetry Ensembles: Improving Diversity and Generalization via Opposing Symmetries”, International Conference on Machine Learning (ICML), 2023. [arXiv]
- Kai Xu, Georgi Ganev, Emile Joubert, Rees Davison, Olivier Van Acker, Luke Robinson. “Synthetic Data Generation of Many-to-Many Datasets via Random Graph Generation”. International Conference on Learning Representations (ICLR), 2023. [OpenReview]
- Work done at Hazy, a startup working on synthetic data generation, where I spent nine wonderful months after my Ph.D.
- Kai Xu, Akash Srivastava, Dan Gutfreund, Felix Sosa, Tomer Ullman, Joshua B. Tenenbaum, Charles Sutton. “A Bayesian-Symbolic Approach to Learning and Reasoning for Intuitive Physics.”, Neural Information Processing Systems (NeurIPS), 2021. [OpenReview, code, website]
- Cole L. Hurwitz, Akash Srivastava, Kai Xu, Justin Jude, Matt Perich, Lee E. Miller, Matthias H. Hennig. “Targeted Neural Dynamical Modeling.”, Neural Information Processing Systems (NeurIPS), 2021. [OpenReview]
- Kai Xu, Tor Erlend Fjelde, Charles Sutton, Hong Ge. “Couplings for Multinomial Hamiltonian Monte Carlo.”, International Conference on Artificial Intelligence and Statistics (AISTATS), 2021. Oral (top 10% of accepted papers) [abs, pdf, arXiv, code]
- Benjamin Rhodes, Kai Xu, Michael U. Gutmann. “Telescoping Density-Ratio Estimation.”, Neural Information Processing Systems (NeurIPS), 2020. Spotlight (top 20% of accepted papers) [abs, pdf, arXiv]
- Akash Srivastava, Kai Xu, Michael U. Gutmann and Charles Sutton. “Generative Ratio Matching Networks.”, International Conference on Learning Representations (ICLR), 2020. [pdf, OpenReview, code]
- Cole L. Hurwitz, Kai Xu, Akash Srivastava, Alessio Paolo Buccino and Matthias Hennig. “Scalable Spike Source Localization in Extracellular Recordings using Amortized Variational Inference.”, Neural Information Processing Systems (NeurIPS), 2019. [abs, pdf, arXiv]
- Kai Xu, Akash Srivastava and Charles Sutton. “Variational Russian Roulette for Deep Bayesian Nonparametrics.”, International Conference on Machine Learning (ICML), 2019. [abs, pdf, suppl, code]
- Hong Ge, Kai Xu, and Zoubin Ghahramani. “Turing: A Language for Flexible Probabilistic Inference.”, International Conference on Artificial Intelligence and Statistics (AISTATS), 2018. [abs, pdf, code, TuringLang]
Deep Generative Models for Mobile Networking (collaboration with Mahesh K. Marina’s group)
- Chuanhao Sun, Kai Xu, Mahesh K. Marina, Howard Benn. “GenDT: Mobile Network Drive Testing Made Efficient with Generative Modeling”. International Conference on emerging Networking EXperiments and Technologies (ACM CoNEXT; 19% acceptance rate), 2022.
- Chuanhao Sun, Kai Xu, Marco Fiore, Mahesh K. Marina, Yue Wang, Cezary Ziemlicki. “AppShot: A Conditional Deep Generative Model for Synthesizing Service-Level Mobile Traffic Snapshots at City Scale”. IEEE Transactions on Network and Service Management (IEEE TNSM), 2022. [IEEE]
- Kai Xu, Rajkarn Singh, Hakan Bilen, Marco Fiore, Mahesh K. Marina, Yue Wang. “CartaGenie: Context-Driven Synthesis of City-Scale Mobile Network Traffic Snapshots”, International Conference on Pervasive Computing and Communications (IEEE PerCom; acceptance rate < 20%), 2022. [IEEE, dataset]
- Kai Xu, Rajkarn Singh, Marco Fiore, Mahesh K. Marina, Hakan Bilen, Muhammad Usama, Howard Benn, Cezary Ziemlicki. “SpectraGAN: Spectrum based Generation of City Scale Spatiotemporal Mobile Network Traffic Data”, International Conference on emerging Networking EXperiments and Technologies (ACM CoNEXT; 17% acceptance rate), 2021. [ACM, dataset]
Workshop
- Kai Xu, Hong Ge, Will Tebbutt, Mohamed Tarek, Martin Trapp, Zoubin Ghahramani. “AdvancedHMC.jl: A robust, modular and efficient implementation of advanced HMC algorithms.”, Symposium on Advances in Approximate Bayesian Inference (AABI), 2019. [abs, pdf, OpenReview]
- Tor Erlend Fjelde, Kai Xu, Mohamed Tarek, Sharan Yalburgi, Hong Ge. “Bijectors.jl: Flexible transformations for probability distributions.”, Symposium on Advances in Approximate Bayesian Inference (AABI), 2019. [abs, pdf, OpenReview]
Preprint
- Simao Eduardo, Kai Xu, Alfredo Nazabal, Charles Sutton. “Repairing Systematic Outliers by Learning Clean Subspaces in VAEs.”, arXiv preprint arXiv:2207.08050, 2022. [arXiv]
- Akash Srivastava, Yamini Bansal, Yukun Ding, Cole Hurwitz, Kai Xu, Bernhard Egger, Prasanna Sattigeri, Josh Tenenbaum, David Cox, Dan Gutfreund. “Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modelling.”, arXiv preprint arXiv:2010.13187, 2020. [arXiv]
- Mohamed Tarek, Kai Xu, Martin Trapp, Hong Ge, Zoubin Ghahramani. “DynamicPPL: Stan-like Speed for Dynamic Probabilistic Models.”, arXiv preprint arXiv:2002.02702, 2020. [arXiv]
- Kai Xu, Dae Hoon Park, Yi Chang and Charles Sutton. “Interpreting Deep Classifiers by Visual Distillation of Dark Knowledge.”, arXiv preprint arXiv:1803.04042, 2018. [arXiv, pdf, demo, code, website]
Software
Professional Services
Journal reviewing
Journal of Machine Learning Research (JMLR)
Conference reviewing
Neural Information Processing Systems (NeurIPS), International Conference on Machine Learning (ICML), International Conference on Learning Representations (ICLR) and International Conference on Artificial Intelligence and Statistics (AISTATS)
Pre-Ph.D / Casual Projects
- Mobile Robot Control using ROS [poster]
- DBD Plasma Reactor Power Monitoring System
- Wireless Brain-Computer Interface for Game Control [video]
- FlatShare: A cost splitting app for flatmates [code]
- Gravity Snake [demo, code]: you control the snake with your phone’s accelerometer; the snake always heads downwards, and you steer it by rotating your phone!
Hobbies
- I’m a fan of Person of Interest (POI). The login prompt on my TiddlyWiki used to be the pun “I am Root”. My favourite clip of POI is the moment when Pink Floyd’s “Welcome to the Machine” plays; it’s such a perfect fit that I really feel Nolan made the whole show just for this moment.
- My all-time favourite anime is Fullmetal Alchemist. I used to use a picture of The Dwarf in the Flask as my macOS and Weibo avatar.
- I love e-sports: I play League of Legends from time to time, and I used to be a StarCraft Zerg player.