Neta Shaul
@shaulneta
PhD Student at @WeizmannScience
DTM vs FM: Lots of interest in how Difference Transition Matching (DTM) connects to Flow Matching (FM). Here is a short animation that illustrates Theorem 1 in our paper: for a very small step size (1/T), DTM converges to an Euler step of FM.
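To make the theorem concrete, here is a minimal Python sketch of the two update rules side by side; `sample_difference` and `velocity` are hypothetical stand-ins for learned models, not the paper's actual API.

```python
import numpy as np

T = 1000        # number of steps; step size h = 1/T
h = 1.0 / T

def dtm_step(x, t, sample_difference):
    # DTM: sample a full source-to-target difference from the learned
    # transition model, then move a fraction h along it.
    d = sample_difference(x, t)      # hypothetical learned sampler
    return x + h * d

def fm_euler_step(x, t, velocity):
    # FM: deterministic Euler step along the learned velocity field.
    return x + h * velocity(x, t)    # hypothetical learned velocity
```

Roughly, since the FM velocity is the conditional expectation of the sampled difference, the two updates should agree to first order as h goes to 0, which is how I read the theorem the animation illustrates.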
[1/n] New paper alert! Excited to introduce Transition Matching (TM)! We're replacing short-timestep kernels from Flow Matching/Diffusion with... a generative model, achieving SOTA text-2-image generation! @urielsinger @itai_gat @lipmanya
If you're curious to dive deeper into Transition Matching (TM), a great starting point is understanding the similarities and differences between Difference Transition Matching (DTM) and Flow Matching (FM).
The Difference Transition Matching (DTM) process is so simple to illustrate, you can calculate it on a whiteboard! At each step: draw all lines connecting source and target (shaded), list those intersecting with the current state (yellow), then sample a line from the list (green). A toy simulation of this recipe is sketched below.
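The whiteboard recipe translates almost line-for-line into a toy 1D simulation. Everything below (the two Gaussians, the 0.05 intersection tolerance) is an illustrative choice of mine, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 10_000, 50
x0 = rng.normal(-2.0, 0.5, n)   # source samples (toy 1D)
x1 = rng.normal(+2.0, 0.5, n)   # target samples (toy 1D)

x = rng.choice(x0)              # current state, initialized at the source
for step in range(T):
    t = step / T
    # Where every source->target line sits at time t.
    line_at_t = (1 - t) * x0 + t * x1
    # "List those intersecting the current state": lines passing
    # (numerically) through x at time t.
    cand = np.flatnonzero(np.abs(line_at_t - x) < 0.05)
    i = rng.choice(cand) if cand.size else np.argmin(np.abs(line_at_t - x))
    # "Sample a line from the list" and follow it for one step.
    x += (x1[i] - x0[i]) / T
print(x)   # ends up near the target mode
```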
Excited to share our latest work led by @itai_gat! We incorporate corrector sampling into autoregressive models for text generation, achieving significant gains in code generation performance. Check it out below.
Excited to share our recent work on corrector sampling in language models! A new sampling method that mitigates error accumulation by iteratively revisiting tokens in a window of previously generated text. With: @shaulneta @urielsinger @lipmanya Link: arxiv.org/abs/2506.06215
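A hedged sketch of the idea as the tweet describes it (revisit tokens in a trailing window of the generated text). `logits_fn` is a hypothetical stand-in for the model, and the resampling rule here is a simplification of mine, not the exact procedure from arxiv.org/abs/2506.06215.

```python
import numpy as np

def softmax(z):
    z = np.asarray(z, dtype=float) - np.max(z)
    e = np.exp(z)
    return e / e.sum()

def corrector_decode(logits_fn, max_len, window=8, seed=0):
    # `logits_fn(prefix_tokens) -> vocab logits` is a hypothetical model API.
    rng = np.random.default_rng(seed)
    tokens = []
    for _ in range(max_len):
        # Usual ancestral step: sample the next token.
        p = softmax(logits_fn(tokens))
        tokens.append(int(rng.choice(len(p), p=p)))
        # Corrector step: revisit one position inside the trailing window
        # and resample it, mitigating error accumulation.
        i = int(rng.integers(max(0, len(tokens) - window), len(tokens)))
        q = softmax(logits_fn(tokens[:i]))
        # Simplification: resamples given the prefix only; the paper's
        # corrector is more careful about what it conditions on.
        tokens[i] = int(rng.choice(len(q), p=q))
    return tokens
```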
Against conventional wisdom, I will be giving a talk with particular focus on the "how" and the various intricacies of applying stochastic control for generative modeling. Mon 9:50am Hall 1 Apex #ICLR2025 Also check out the other talks at delta-workshop.github.io!
Had an absolute blast presenting at #ICLR2025! Thanks to everyone who came to visit my poster. Special shoutout to @drscotthawley for taking a last-minute photo.
I'll be at the poster session with our follow-up on Discrete Flow Matching. We derive a closed-form solution to the kinetic optimal problem for the conditional velocity on discrete spaces. Into flow models? Come chat! Poster: Sat 10am (#191), Oral: Sat 3:30pm (6E) #ICLR2025
We are presenting 3 orals and 1 spotlight at #ICLR2025 on two primary topics: generalizing the data-driven flow matching algorithm to jump processes, arbitrary discrete corruption processes, and beyond; and highly scalable algorithms for reward-driven learning settings.
A new (and comprehensive) Flow Matching guide and codebase released! Join us tomorrow at 9:30AM @NeurIPSConf for the FM tutorial to hear more... arxiv.org/abs/2412.06264 github.com/facebookresear…
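If you just want the core recipe before the tutorial, the conditional Flow Matching loss with the linear (Cond-OT) path fits in a few lines. This is a minimal sketch of the standard CFM objective; `velocity_net(x, t)` is my placeholder name for whatever network you train.

```python
import torch

def cfm_loss(velocity_net, x1):
    # Conditional Flow Matching with the linear (Cond-OT) path:
    # x_t = (1 - t) * x0 + t * x1, whose conditional velocity is x1 - x0.
    x0 = torch.randn_like(x1)           # source: standard Gaussian noise
    t = torch.rand(x1.shape[0], 1)      # per-sample time; assumes x1 is (batch, dim)
    xt = (1 - t) * x0 + t * x1          # point on the linear path
    target = x1 - x0                    # conditional velocity of that path
    return ((velocity_net(xt, t) - target) ** 2).mean()
```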
A new #ICML2023 paper investigates the Kinetic Energy of Gaussian Probability Paths, which are key in training diffusion/flow models. A surprising takeaway: in high dimensions, *linear* paths (Cond-OT) are Kinetic Optimal! Led by @shaulneta w/ @RickyTQChen @lematt1991 @mnick
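For readers outside the area, here is the object being optimized, in my own notation (a sketch based on the FM literature, not lifted from the paper): a Gaussian path and the kinetic energy it is scored by are

$$X_t = \alpha_t X_1 + \sigma_t X_0, \qquad \mathcal{E}(\alpha, \sigma) = \int_0^1 \mathbb{E}\,\|u_t(X_t)\|^2 \, dt,$$

where $u_t$ is the marginal velocity field generating the path. The takeaway above says that, as the data dimension grows, the minimizer over schedulers approaches the linear (Cond-OT) choice $\alpha_t = t$, $\sigma_t = 1 - t$.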