malkin1729
@FelineAutomaton
Mathematician/informatician thinking probabilistically, expecting the same of you. ‘Tis categories in the mind and guns in their hands which keep us enslaved &🦋
Ever the inhabiter of liminal spaces, I can also now be found fluttering in bluer skies 🦋
Finally gave in. I intend to post here mainly for professional purposes and to follow embodied agents who pass the Turing test, whether they are real-life friends, collaborators, neither, or both.
(1/n) The usual assumption in GFlowNet environments is acyclicity. Have you ever wondered if it can be relaxed? Does the existing GFlowNet theory translate to the non-acyclic case? Is efficient training possible? We shed new light on these questions in our latest work! @icmlconf
1/ 💻 Queer in AI is hosting a social at #ICML2025 in Vancouver on 📅 July 16, and you’re invited! Let’s network, enjoy food and drinks, and celebrate our community. Details below…
A great pleasure to crash two Bayesian statistics conferences with a dose of diffusion wisdom — last week in Singapore (bayescomp2025.sg), now in Cambridge (newton.ac.uk/event/rclw03/) — with the two authors of this very nice paper.
🚨 New paper: “Towards Adaptive Self-Normalized IS” TL;DR: To estimate µ = E_p[f(θ)] when p(θ) has an intractable partition function, instead of running MCMC on p(θ) or learning a parametric q(θ), we run MCMC directly on p(θ)|f(θ) − µ|, the variance-minimizing proposal. arxiv.org/abs/2505.00372
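To make the idea above concrete, here is a minimal 1-D sketch (not the paper's implementation): a random-walk Metropolis chain targets q̃(θ) ∝ p̃(θ)·(|f(θ) − µ̂| + c), where µ̂ is the running self-normalized IS estimate plugged back into the proposal. The toy target (a standard Gaussian kernel treated as unnormalized), the test function f(θ) = θ², the floor constant c, and all step sizes are my own illustrative choices.

```python
import math
import random

def p_tilde(theta):
    """Unnormalized target density: standard Gaussian kernel (partition 'unknown')."""
    return math.exp(-0.5 * theta * theta)

def f(theta):
    """Test function; under the standard Gaussian, mu = E_p[f] = E[theta^2] = 1."""
    return theta * theta

def adaptive_snis(n_iters=100_000, step=1.0, seed=0):
    """Adaptive self-normalized IS: MCMC on q_tilde ∝ p_tilde * (|f - mu_hat| + c),
    with the running estimate mu_hat plugged back into the proposal."""
    rng = random.Random(seed)
    c = 0.5               # floor keeping q_tilde positive and the weights bounded
    mu_hat = 0.0          # running plug-in estimate of mu
    theta = 0.0
    num = den = 0.0       # SNIS numerator sum(w*f) and denominator sum(w)

    def q_tilde(t):       # unnormalized (approximately) variance-minimizing proposal
        return p_tilde(t) * (abs(f(t) - mu_hat) + c)

    for _ in range(n_iters):
        # Random-walk Metropolis step targeting the current q_tilde
        prop = theta + rng.gauss(0.0, step)
        if rng.random() < q_tilde(prop) / q_tilde(theta):
            theta = prop
        # Importance weight w = p_tilde(theta) / q_tilde(theta)
        w = 1.0 / (abs(f(theta) - mu_hat) + c)
        num += w * f(theta)
        den += w
        mu_hat = num / den  # adapt: update the estimate used by the proposal
    return mu_hat
```

With the toy setup above, `adaptive_snis()` should land near the true value µ = 1; the floor c trades off weight stability against fidelity to the theoretically optimal proposal p(θ)|f(θ) − µ|, which vanishes exactly where f(θ) = µ.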
Ecstatic to show off some work my brilliant colleagues and I did at @iclr_conf this year! 🚀 We address the credit-assignment challenge of long trajectories in RL and GFlowNets by constructing higher-order actions, or “chunks”, effectively compressing trajectory length!
🚀 New Preprint! 🚀 In-Context Parametric Inference: Point or Distribution Estimators? Thrilled to share our work on inferring probabilistic model parameters explicitly conditioned on data, in collab with @Yoshua_Bengio, @FelineAutomaton & @g_lajoie_! 🔗arxiv.org/abs/2502.11617
My PhD thesis, entitled "Generative Flow Networks: Theory and Applications to Structure Learning", is now available on arXiv 🎓 📖 arxiv.org/abs/2501.05498 🔖 Want to learn what GFlowNets are? Check out Chapters 2, 3 & 4!
This week I successfully defended my PhD! 🎓🎊 Many thanks to my committee @dhanya_sridhar @SimonLacosteJ @sirbayes, and a particularly huge thanks to my advisor @Yoshua_Bengio for his incredible support throughout my PhD.
Happy to share one of my latest works! If you are interested in diffusion samplers, please take a look 🙃! Many thanks to all my colleagues for their intensive work and fruitful collaboration, especially to @FelineAutomaton for leading this project! Stay tuned for future ones!
Happy to share our latest work on #diffusion models without data: building theoretical bridges between existing methods, analysing their continuous-time asymptotics, and showing some cool practical implications. arxiv.org/abs/2501.06148 #MachineLearning 1/9