Paul Jeha
@jeha_paul
PhD student in Copenhagen / Generative Models - Diffusion & exploring. https://pablo2909.github.io/pauljeha/ the work is mysterious and important
Mfs will say "I need to lock-in" and then just paste their error trace into claude code before going back to doomscrolling twitter
The SPIGM Workshop is back at @NeurIPSConf with an exciting new theme at the intersection of probabilistic inference and modern AI models! We welcome submissions on all topics related to probabilistic methods and generative models---looking forward to your contributions!
🌞🌞🌞 The third Structured Probabilistic Inference and Generative Modeling (SPIGM) workshop is **back** this year with @NeurIPSConf in San Diego! In the era of foundation models, we focus on a natural question: is probabilistic inference still relevant? #NeurIPS2025
When sampling from multimodal distributions, we rely on multiple temperatures to balance exploration and exploitation. Can we bring this idea into the world of diffusion-based neural samplers? 👉Check out our ICML paper to see how this idea can lead to significant improvements!
Excited to share our new paper accepted at ICML 2025 👉 “PTSD: Progressive Tempering Sampler with Diffusion”, which aims to make sampling from unnormalised densities more efficient than state-of-the-art methods like parallel tempering. Check out the thread below 👇
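For readers new to the baseline PTSD improves on, here is a minimal textbook sketch of the parallel-tempering swap move the tweet refers to: hot chains (small β) explore modes, cold chains (β near 1) refine them, and Metropolis swaps exchange states between them. This is a generic illustration, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(3)

def pt_swap(x_hot, x_cold, log_p, beta_hot, beta_cold, rng):
    """Metropolis swap move between two parallel-tempering chains.

    Each chain targets p(x)^beta. The swap is accepted with probability
    min(1, exp((beta_cold - beta_hot) * (log p(x_hot) - log p(x_cold)))),
    which preserves both tempered distributions.
    """
    log_accept = (beta_cold - beta_hot) * (log_p(x_hot) - log_p(x_cold))
    if np.log(rng.uniform()) < min(0.0, log_accept):
        return x_cold, x_hot   # accepted: exchange the two states
    return x_hot, x_cold

# When the hot chain sits in a much higher-probability region than the
# cold chain, the swap is accepted with probability one:
a, b = pt_swap(0.0, 3.0, lambda x: -x**2, beta_hot=0.2, beta_cold=1.0, rng=rng)
print(a, b)  # 3.0 0.0
```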
TorchDR 0.3 is here with some major improvements, taking the library to the next level! TorchDR leverages vectorized implementation on GPU for super fast dimensionality reduction. Thanks to all the contributors!! Description below🧵
In an hour, François and I are presenting at ICML our paper on crystalline material generation using diffusion models, where the key innovation is a diffusion process for the fractional coordinates that is inspired by kinetic Langevin dynamics. Paper: arxiv.org/abs/2507.03602
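For context on the "kinetic Langevin" inspiration: in underdamped dynamics the noise enters through a velocity variable rather than the position, so positions evolve smoothly. Below is a generic semi-implicit Euler sketch of those dynamics (not the paper's actual discretisation for fractional coordinates), sanity-checked on a standard Gaussian target.

```python
import numpy as np

rng = np.random.default_rng(4)

def kinetic_langevin_step(x, v, score, dt, gamma, rng):
    """One semi-implicit Euler step of kinetic (underdamped) Langevin.

    dv = (score(x) - gamma * v) dt + sqrt(2 * gamma) dW,  dx = v dt.
    Noise hits the velocity, not the position, so x-paths are smooth.
    """
    v = v + dt * (score(x) - gamma * v) \
        + np.sqrt(2.0 * gamma * dt) * rng.normal(size=v.shape)
    x = x + dt * v
    return x, v

# Sanity check on a standard Gaussian target (score = -x): after burn-in
# the x-marginal should be close to N(0, 1).
x = np.zeros(20_000)
v = rng.normal(size=20_000)
for _ in range(400):
    x, v = kinetic_langevin_step(x, v, lambda s: -s, dt=0.05, gamma=1.0, rng=rng)
```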
[1/9]🚀Excited to share our new work, RNE! A plug-and-play framework for everything about diffusion model density and control: density estimation, inference-time control & scaling, energy regularisation. More details👇 Joint work with @jmhernandez233 @YuanqiD, Francisco Vargas
Heading to @MSFTResearchCam over the summer to intern and work on some cool diffusion stuff 🫶🏽 hit me up if you want to grab a coffee
Francisco Vargas and @msalbergo with a combined talk on sampling, inference, and transport at #FPIworkshop.
Heading to @iclr_conf 2025 tomorrow in Singapore! I’ll be giving a talk at the FPI workshop on our most recent work with @brianlee_lck. Hit me up if you want to chat, or have some good coffee! arxiv.org/pdf/2502.06079
We have a new cool preprint with @brianlee_lck. We developed a sweet Sequential Monte Carlo algorithm for unbiased samples from a tempered distribution p(x0)p(y|x0)^α and applied it to discrete diffusion for text arxiv.org/abs/2502.06079. Huge thanks to Francisco Vargas @msalbergo…
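The core tempering idea in that preprint can be illustrated with a single SMC-style reweighting step: draws from the prior are importance-weighted by the likelihood raised to the power α, then resampled. A toy numpy sketch under those assumptions (the function name and setup are hypothetical, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

def tempered_resample(prior_samples, log_lik, alpha, rng):
    """Importance-resample prior draws toward p(x) * p(y|x)^alpha.

    Weights are the tempered likelihood; multinomial resampling then
    yields approximate draws from the tempered target. alpha=0 keeps
    the prior, alpha=1 gives the full posterior.
    """
    logw = alpha * log_lik(prior_samples)
    logw -= logw.max()                      # numerical stability
    w = np.exp(logw)
    w /= w.sum()
    idx = rng.choice(len(prior_samples), size=len(prior_samples), p=w)
    return prior_samples[idx]

# Toy example: prior N(0, 1), Gaussian likelihood centred at 2 with unit
# variance. At alpha=1 the tempered target is N(1, 0.5).
x = rng.normal(0.0, 1.0, size=50_000)
log_lik = lambda s: -0.5 * (s - 2.0) ** 2
x_post = tempered_resample(x, log_lik, alpha=1.0, rng=rng)
print(round(float(x_post.mean()), 2))  # close to 1.0
```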

Tweeting again about sampling: my favourite 2024 Monte Carlo paper is arxiv.org/abs/2307.01050 by F. Vargas, @shreyaspadhy, D. Blessing & N. Nüsken. They propose a "simple" loss to learn the drift you need to add to Langevin to follow a fixed probability path.
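To make the "drift you add to Langevin" concrete, here is a minimal sketch of the simulation side: ordinary unadjusted Langevin uses only the target's score, and the learned term b(x) enters as an extra drift. The `drift` placeholder below is hypothetical (zero, so the check reduces to plain ULA); the paper's contribution is the loss for learning it, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def langevin_step(x, score, drift, eps, rng):
    """One step of Langevin dynamics with an extra (learned) drift term.

    x_{k+1} = x_k + eps * (score(x_k) + b(x_k)) + sqrt(2 * eps) * noise.
    With b = 0 this is plain unadjusted Langevin targeting the score's
    distribution; a learned b steers the chain along a probability path.
    """
    noise = rng.normal(size=x.shape)
    return x + eps * (score(x) + drift(x)) + np.sqrt(2.0 * eps) * noise

# Sanity check: target N(0, 1) (score = -x), zero extra drift.
score = lambda x: -x
drift = lambda x: 0.0 * x       # placeholder for the learned drift
x = rng.normal(size=20_000)
for _ in range(200):
    x = langevin_step(x, score, drift, eps=0.05, rng=rng)
```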
Honored to receive the Jorck's Fund Research Award (DKK 450,000) for my research and teaching in generative AI, presented at the Danish Supreme Court! 🎉 Grateful for the incredible support of my PhD students, postdocs, collaborators and family. 🙏 Details: dtu.dk/english/news/n……
Poster session happening *today* at 4:30 local time. *East* Exhibit Hall. Poster #3511. Looking forward to presenting this work! See you there? 🙂
🛩️ On my way to #NeurIPS2024 and excited to chat about (ML applications of) linear algebra, differentiable programming, and probabilistic numerics! Feel free to DM if you’d like to meet up, hang out, and/or discuss any of these topics 😊
Heading to Vancouver for NeurIPS to present our paper “On Conditional Diffusion Models for PDE Simulation”. I'll be together with Sasha and @CristianaD2202 at poster 2500 during Thursday’s late afternoon session. Looking forward to exciting discussions and meeting new people!
Ever thought about using matrix-exp's or log-det's in large-scale ML (think PDEs/GGN matrices with >1M rows)? I have, and maybe you have, too—but gradients? That's where it gets tricky. Not anymore! We have a #NeurIPS2024 spotlight: "Gradients of Functions of Large Matrices."🧵
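The log-det case in that spotlight can be illustrated with the classical ingredients it builds on: d logdet(A)/dθ = tr(A⁻¹ dA/dθ), estimated matrix-free via Hutchinson's trace trick so that only matvecs and linear solves are needed. A toy dense sketch under those assumptions, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

def grad_logdet_hutchinson(A, dA, solve, n_probes, rng):
    """Estimate d logdet(A)/dtheta = tr(A^{-1} dA) without forming A^{-1}.

    Hutchinson's estimator: tr(M) ≈ mean_v v^T M v over Rademacher
    probes v. Each term needs one matvec with dA and one solve with A,
    which is what makes the idea scale to very large matrices.
    """
    n = A.shape[0]
    est = 0.0
    for _ in range(n_probes):
        v = rng.choice([-1.0, 1.0], size=n)
        est += v @ solve(A, dA @ v)
    return est / n_probes

# Toy check: A(theta) = theta * I with n = 4, so logdet(A) = n*log(theta)
# and the exact derivative at theta = 2 is n/theta = 2. Here every probe
# gives v^T (A^{-1} dA) v = |v|^2 / theta, so the estimate is exact.
theta, n = 2.0, 4
A = theta * np.eye(n)
dA = np.eye(n)                      # dA/dtheta
g = grad_logdet_hutchinson(A, dA, np.linalg.solve, 100, rng)
print(round(g, 2))  # 2.0
```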
diffusion models are just a ploy by CS PhDs to get their departments to pay for them to learn stochastic calculus so that they can get finance jobs
"A tutorial on automatic differentiation with complex numbers" (by Nicholas Krämer): arxiv.org/abs/2409.06752
Landed in Vienna for ICML! Reach out if you want to chat or grab a (specialty) coffee.