Kevin Han Huang
@KevinHanHuang1
Incoming postdoc @ProbAIHub, hosted @warwickstats @Princeton @orfe. PhD @GatsbyUCL. Works on universality and ML theory. He/him. 🌈
🚨 PhD opportunity! I’m recruiting a PhD student with a strong background in math/stats/CS to join my group at NTU Singapore 🇸🇬 Start: Jan or Aug 2026 Topic: Foundations of Epistemic Uncertainty in ML 🧠🔍 📌 Details: chau999.github.io/group/ RT appreciated!
A better photo, in retrospect 😄
🎊 Congratulations to Kevin Huang (@KevinHanHuang1) on passing his PhD viva with minor corrections! 🥳 🔖 Universality beyond the classical asymptotic regime
Long shot, but does anyone still need an Airbnb for #ICML2025? I had booked everything, but I was just diagnosed with an acute illness and now can't go 😭😭 Since I'm not attending, the funding body might not reimburse the costs. Would highly appreciate retweets or messages about the Airbnb 🙏🏻
Would love help identifying amazing ML researchers with strong connections to Canada who are currently outside Canada (and thus potential recruitment targets as the US situation deteriorates). DMs please. Retweets please.
Today is the day of the pre-ICML event at UCL! Come check out the exciting work from academics, industry researchers, postdocs and PhD students from around London: sites.google.com/view/pre-icml-… @stats_UCL @uclcsml
Congratulations to my former student, Jun Yang, for winning a Sapere Aude!! Since graduating, Jun has opened up a number of new areas in high-dimensional Monte Carlo analysis and Monte Carlo algorithms. dff.dk/en/our-funded-…
The Hub is recruiting! The University of Warwick is searching for a Research Fellow - check it out and apply here➡️ shorturl.at/ELQNR
📢 @Cambridge_Uni & @SciTechgovuk announce prestigious Spärck #AI Scholarships to support next generation of AI leaders. The scholarships, aimed at domestic and international students, will open to its first cohort in the 2026/27 academic year. cam.ac.uk/news/cambridge…
Looking forward to the pre-ICML event at UCL on the 3rd July: sites.google.com/view/pre-icml-…. Registration and the call for talks/posters are now open!
Meanwhile, excited to be in #Lyon for #COLT2025, with a co-first author paper (arxiv.org/abs/2502.15752) with the amazing team -- Matthew Mallory and our advisor Morgane Austern! Keywords: Gaussian universality, dependent data, convex Gaussian min-max theorem, data augmentation!
Missing ICML due to visa issues :'(, but looking forward to sharing our ICML paper (arxiv.org/abs/2502.05318) as a poster at #BayesComp, Singapore! Work on symmetrising neural nets for the Schrödinger equation in crystals, with the amazing Zhan Ni, Elif Ertekin, Peter Orbanz and @ryan_p_adams
Introducing All-atom Diffusion Transformers — towards Foundation Models for generative chemistry, from my internship with the FAIR Chemistry team @OpenCatalyst @AIatMeta There are a couple ML ideas which I think are new and exciting in here 👇
Writing math proofs in Lean is surprisingly addictive. Watching Terence Tao formalize Lean proofs feels like watching a top-tier gamer streaming on Twitch. :-) youtube.com/watch?v=c1ixXM…
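For a flavour of what this looks like, here is a minimal illustrative Lean 4 snippet (my own toy example, not from the stream): a one-line proof that addition on the naturals is commutative, closed by the core library lemma Nat.add_comm.

```lean
-- A tiny Lean 4 proof: addition on Nat is commutative.
-- `example` states an anonymous theorem; the proof term is the
-- library lemma Nat.add_comm applied to a and b.
example (a b : Nat) : a + b = b + a := Nat.add_comm a b
```

Much of the addictive game-like feel comes from the goal state updating interactively as each tactic or term is applied.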
This is a huge development. I want to highlight the theoreticians behind the scene, because this paper represents the realization of the impact of years of careful theoretical research. It starts with Greg Yang (@TheGregYang) opening up research on the muP scaling and…
(1/7) @CerebrasSystems Paper drop: arxiv.org/abs/2505.01618 TLDR: We introduce CompleteP, which offers depth-wise hyperparameter (HP) transfer (Left), FLOP savings when training deep models (Middle), and a larger range of compute-efficient width/depth ratios (Right). 🧵 👇
🚨 Deep learning theory improves transformers: ** HP transfer in depth & width ** compute efficient pre-training Really fun collab with @DeyNolan, @clairezhang, @mufan_li, @lorenzo_noci, @blake__bordelon, @ShaneBergsma, Joel Hestness, @CPehlevan, @CerebrasSystems
Oral #AISTATS25: Robust Kernel Hypothesis Testing under Data Corruption
- Robustify any permutation test to be immune to data corruption of up to X samples
- Robust minimax optimality for MMD & HSIC
Monday 5 May - Oral Session 7 (Robust Learning) - Poster Session 3. Presented by Ilmun Kim
📢 We have an opportunity for students to join our PhD programme in Theoretical Neuroscience and Machine Learning this September. Application deadline is 27 May 2025. Information & how to apply 👉 ucl.ac.uk/gatsby/study-a…
I am truly excited to share our latest work with @MScherbela, @GrohsPhilipp, and @guennemann on "Accurate Ab-initio Neural-network Solutions to Large-Scale Electronic Structure Problems"! arxiv.org/abs/2504.06087