Lester Mackey
@LesterMackey
Machine learning researcher @MSFTResearch (@MSRNE); adjunct professor @Stanford
Great to see our paper presenting Recall, a framework that calibrates clustering to account for the impact of data "double-dipping" in single-cell studies, out today in AJHG! Congratulations, @AlanDenadel and co-authors!
🚨Online now! 📄Artificial variables help to avoid over-clustering in single-cell RNA sequencing, by @alandenadel & colleagues cell.com/ajhg/abstract/…
So you want to skip our thinning proofs, but you'd still like our out-of-the-box attention speedups? I'll be presenting the Thinformer in two ICML workshop posters tomorrow! Catch me at ES-FoMo (1-2:30, East Hall A) and at LCFM (10:45-11:30 & 3:30-4:30, West 202-204)
Your data is low-rank, so stop wasting compute! In our new paper on low-rank thinning, we share one weird trick to speed up Transformer inference, SGD training, and hypothesis testing at scale. Come by ICML poster W-1012 Tuesday at 4:30!
If you’re not at ICML, don’t worry! You can still read our work. Our new theoretically principled algorithms beat recent baselines across multiple tasks—including Transformer approximation! arxiv.org/abs/2502.12063
At ICML this week? Check out @annabelle_cs's paper in collaboration with @LesterMackey and colleagues on Low-Rank Thinning! ⏰ Tue 15 Jul 4:30 - 7 p.m. PDT New theory, dataset compression, efficient attention and more: icml.cc/virtual/2025/p…
It's Hard to Be Normal: The Impact of Noise on Structure-agnostic Estimation ift.tt/RcE5Hby
Causal integration of chemical structures improves representations of microscopy images for morphological profiling. arxiv.org/abs/2504.09544
Exciting news in the global statistics community! Grace Wahba was awarded the prestigious 2025 International Prize in Statistics for her groundbreaking work on smoothing splines, which revolutionized data analysis and machine learning. https://www.statprize.org/index.cfm
We are pleased to announce that the Ethel Newbold Prize 2025 has been awarded to Po-Ling Loh from the Department of Pure Mathematics and Mathematical Statistics (DPMMS), University of Cambridge. CONGRATULATIONS!
Congratulations to the winners of the inaugural David Cox Medal for Statistics, Prof Eric Tchetgen Tchetgen (University of Pennsylvania), Prof Nancy Zhang (University of Pennsylvania) and Prof Richard Samworth (University of Cambridge)!
Interested in learning about Kernel Discrepancies❓ Maximum Mean Discrepancy (MMD), Hilbert-Schmidt Independence Criterion (HSIC), Kernel Stein Discrepancy (KSD) 🧐 Don't know where to begin? 👀 Check out my Practical Introduction to Kernel Discrepancies: MMD, HSIC & KSD! arxiv.org/abs/2503.04820
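To make the first of those discrepancies concrete: here is a minimal NumPy sketch (not from the linked tutorial) of the standard unbiased squared-MMD estimator with a Gaussian kernel. The bandwidth, sample sizes, and the mean shift in the demo are arbitrary choices for illustration.

```python
import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-sq / (2 * bandwidth**2))

def mmd2_unbiased(X, Y, bandwidth=1.0):
    # Unbiased estimate of squared MMD between samples X (n points) and Y (m points):
    # within-sample kernel averages exclude the diagonal (i != j terms only).
    Kxx = gaussian_kernel(X, X, bandwidth)
    Kyy = gaussian_kernel(Y, Y, bandwidth)
    Kxy = gaussian_kernel(X, Y, bandwidth)
    n, m = len(X), len(Y)
    np.fill_diagonal(Kxx, 0)
    np.fill_diagonal(Kyy, 0)
    return (Kxx.sum() / (n * (n - 1))
            + Kyy.sum() / (m * (m - 1))
            - 2 * Kxy.mean())

rng = np.random.default_rng(0)
# Same distribution: estimate should hover near zero (it can be slightly negative).
same = mmd2_unbiased(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
# Mean-shifted distribution: estimate should be clearly positive.
diff = mmd2_unbiased(rng.normal(size=(200, 2)), rng.normal(2.0, 1.0, size=(200, 2)))
```

The diagonal exclusion is what makes the estimator unbiased; HSIC and KSD are built from kernel matrices in a similar way, differing in what the kernel averages measure.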
Please join us in congratulating this year's winners! You can read more about the winners and the awards on our website: community.amstat.org/copss/home
NeurIPS 2025 is soliciting self-nominations for reviewers and ACs. Please read our blog post for details on eligibility criteria, and process to self-nominate: blog.neurips.cc/2025/03/10/sel…
🙌🎉Our 2025 recipient of the COPSS Presidents' Award is Lester Mackey! This award is given annually to a young member of the statistical community in recognition of outstanding contributions to the profession of statistics.
🎉 Excited to announce our 2025 COPSS Emerging Leader Award (ELA) winners. The COPSS ELA recognizes early career statistical scientists who show evidence of and potential for leadership and who will help shape and strengthen the field.
Presenting CleaveNet: an AI pipeline for the generative design of protease substrates. We designed novel, selective substrates for MMPs by conditioning on an enzyme activity profile and validated them through a large-scale in vitro screen. For protease biology and beyond! 🚀
Single cell foundation models have the potential to revolutionize many areas of biology. How does the composition of training data affect the performance of these models? In my intern project @MSRNE, we identify patterns that can augment next-generation models.