Anji Liu
@liu_anji
Incoming Assistant Professor (Presidential Young Professor, PYP) at the National University of Singapore (NUS).
🎓 Looking for PhD students, postdocs & interns! I’m recruiting for my new lab at @NUSComputing, focusing on generative modeling, reasoning, and tractable inference. 💡 Interested? Learn more here: liuanji.github.io 🗓️ PhD application deadline: June 15, 2025
“That’s one small [MASK] for [MASK], a giant [MASK] for mankind.” – [MASK] Armstrong

Can autoregressive models predict the next [MASK]? It turns out yes, and quite easily… Introducing MARIA (Masked and Autoregressive Infilling Architecture): arxiv.org/abs/2502.06901
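The idea of infilling masked spans with a left-to-right model can be illustrated with a common trick (fill-in-the-middle-style reordering; this is a generic sketch, and MARIA's actual architecture may differ): move the masked span to the end of the sequence, so predicting it becomes ordinary next-token prediction. The sentinel tokens `<PRE>`, `<SUF>`, `<MID>` here are illustrative placeholders, not names from the paper.

```python
# Hedged sketch: casting span infilling as next-token prediction by
# reordering the sequence (fill-in-the-middle style). Not MARIA's actual
# method -- just one well-known way AR models can infill.
def to_fim(tokens, span_start, span_end):
    """Move the masked span to the end so a left-to-right model can predict it."""
    prefix = tokens[:span_start]
    middle = tokens[span_start:span_end]
    suffix = tokens[span_end:]
    # Training sequence: <PRE> prefix <SUF> suffix <MID> middle
    return ["<PRE>", *prefix, "<SUF>", *suffix, "<MID>", *middle]

seq = "that's one small step for man".split()
print(to_fim(seq, 3, 4))
# ['<PRE>', "that's", 'one', 'small', '<SUF>', 'for', 'man', '<MID>', 'step']
```

After reordering, a standard autoregressive model sees the full left and right context before it has to generate the missing span.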
🚀 Exciting news! Our paper "Learning to Discretize Diffusion ODEs" has been accepted as an Oral at #ICLR2025! 🎉 [1/n] We propose LD3, a lightweight framework that learns the optimal time discretization for sampling from pre-trained Diffusion Probabilistic Models (DPMs).
📢 I’m recruiting PhD students @CS_UVA for Fall 2025! 🎯 Neurosymbolic AI, probabilistic ML, trustworthiness, AI for science. See my website for more details: zzeng.me 📬 If you're interested, apply and mention my name in your application: engineering.virginia.edu/department/com…
My group will be hiring several PhD students next year. I can’t reveal the details before the official announcement at NeurIPS, but this involves an exciting collaboration with a well-known non-profit on #AI4Science and serious compute power. Stay tuned and apply at UCI!
I’ll be attending #NeurIPS2024, where I’ll present our spotlight: Scaling Continuous Latent Variable Models as Probabilistic Integral Circuits (PICs). TL;DR: We learn continuous hierarchical mixtures as DAG-shaped PICs, and scale them using neural functional sharing techniques.
Very cool work! It sounds quite similar to Discrete Copula Diffusion by @liu_anji, in that it combines both pretrained diffusion and AR models to improve results x.com/liu_anji/statu…
[1/n] 🚀 Diffusion models for discrete data excel at modeling text, but they need hundreds to thousands of denoising steps to perform well. We show this is because discrete diffusion models predict each output token *independently* at each denoising step.
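Why independent per-token prediction hurts can be seen with a toy example (my illustration, not from the paper): if the true joint distribution only supports "new york" and "los angeles", a denoiser that samples each position independently from its marginals will frequently emit invalid mixtures like "new angeles".

```python
# Toy illustration: sampling tokens independently from per-position
# marginals loses inter-token dependencies present in the joint.
import random

random.seed(0)

# True joint distribution over two tokens: only these pairs have mass.
joint = {("new", "york"): 0.5, ("los", "angeles"): 0.5}

# An independent-per-token predictor can only represent the marginals.
marg0, marg1 = {}, {}
for (t0, t1), p in joint.items():
    marg0[t0] = marg0.get(t0, 0.0) + p
    marg1[t1] = marg1.get(t1, 0.0) + p

def sample_independent():
    t0 = random.choices(list(marg0), weights=list(marg0.values()))[0]
    t1 = random.choices(list(marg1), weights=list(marg1.values()))[0]
    return (t0, t1)

samples = [sample_independent() for _ in range(10_000)]
invalid = sum(1 for s in samples if s not in joint)
print(f"invalid pairs: {invalid / len(samples):.0%}")  # roughly 50%
```

With many small denoising steps each position changes rarely, so the independence assumption does little damage per step; with few large steps, errors like the one above compound, which matches the observation that discrete diffusion needs many steps.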
Are you looking for an inference algorithm that supports your discrete-continuous probabilistic program? Look no further! We have developed a new probabilistic programming language (PPL) called HyBit that provides scalable support for discrete-continuous probabilistic programs.
I'm very happy to share PyJuice, the culmination of years of scaling up probabilistic & logical reasoning for neural networks and LLMs, all in a PyTorch framework! #NeuroSymbolicAI
📢 Want to train your *tractable* probabilistic models (a.k.a. Probabilistic Circuits or PCs) lightning fast and at scale? We developed PyJuice (github.com/Tractables/pyj…), a lightweight and easy-to-use PyTorch-based package for training PCs and running inference on them!
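What makes PCs "tractable" can be shown in a few lines of plain Python (a minimal sketch of the circuit semantics, not PyJuice's API): a PC is a DAG of sum and product nodes over indicator leaves, and queries like marginals reduce to a single bottom-up pass where marginalized variables simply make their leaves output 1.

```python
# Minimal sketch (plain Python, not PyJuice's API): a tiny probabilistic
# circuit over two binary variables X, Y, evaluated bottom-up.
def leaf(var, val):
    # Indicator leaf: 1.0 if the evidence matches, or if var is marginalized
    # (absent from the evidence / set to None).
    return lambda e: 1.0 if e.get(var) in (None, val) else 0.0

def product(*children):
    def f(e):
        out = 1.0
        for c in children:
            out *= c(e)
        return out
    return f

def weighted_sum(weights, children):
    return lambda e: sum(w * c(e) for w, c in zip(weights, children))

# Circuit: 0.4 * [X=1][Y=1] + 0.6 * [X=0][Y=0]  (mixture of two products)
root = weighted_sum(
    [0.4, 0.6],
    [product(leaf("X", 1), leaf("Y", 1)),
     product(leaf("X", 0), leaf("Y", 0))],
)

print(root({"X": 1, "Y": 1}))  # joint P(X=1, Y=1) -> 0.4
print(root({"X": 1}))          # marginal P(X=1)  -> 0.4, same single pass
```

The point of a package like PyJuice is to run exactly this kind of feed-forward evaluation, vectorized on GPUs, for circuits with millions of nodes.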