Yang Luo
@YangL_7
Ph.D. candidate @NUSingapore | Efficient ML | ACL Outstanding Paper Award
Training-free Video Enhancement: Achieved 🎉 Nice work with @oahzxl @shaowenqi126301 @VictorKaiWang1 @VitaGroupUT @YangYou1991 et al. Non-trivial enhancement, training-free, and plug-and-play 🥳 Blog: oahzxl.github.io/Enhance_A_Vide… (🧵1/6)
Customizing your LLMs in seconds using prompts 🥳! Excited to share our latest work with @HPCAILab, @VITAGroupUT, @k_schuerholt, @YangYou1991, @mmbronstein, @damianborth: Drag-and-Drop LLMs (DnD). Two features: tuning-free, and comparable to or even better than full-shot tuning. (🧵1/8)
Representation Alignment (REPA) is NOT ALWAYS helpful for diffusion training! 🤷 Sharing our latest work w/ @HPCAILab and @VITAGroupUT: "REPA Works Until It Doesn't: Early-Stopped, Holistic Alignment Supercharges Diffusion Training". Acceleration of up to 28x w/o performance drop. (🧵1/7)
Glad to see CAME's efficiency in the training of SANA 1.5!
🔥 SANA 1.5: A linear Diffusion Transformer pushes SOTA in text-to-image generation! Key innovations:
• Depth-growth training: 1.6B → 4.8B params
• Memory-efficient 8-bit optimizer
• Flexible model pruning
• Inference scaling for better quality
Achieves 0.80 on GenEval! 🚀
🚀Towards efficient Diffusion Transformers! 😆We are happy to introduce RAS, the first diffusion sampling strategy that allows regional variability in sampling ratios, achieving more than 2x speedup! 🔌Training-free, plug and play! 💪Nice work with @MSFTResearch @YangYou1991…
Generating ~200 million parameters in just minutes! 🥳 Excited to share our work with @MTDovent, @heisejiasuo96, and @YangYou1991: "Recurrent Diffusion for Large-Scale Parameter Generation" (RPG for short). Example: obtain customized models using prompts (see below). (🧵1/8)
We are pleased to announce that PIXART-Σ was trained with our CAME optimizer (arxiv.org/abs/2307.02047). Glad to see our work having real-world impact on the training of DiT models! Code: github.com/yangluo7/CAME

Explore the blog post for a concise and insightful overview of the CAME optimizer! Congrats to the first author @YangYoungLL! 🔥 Blog: zhengzangw.github.io/blogs/came ArXiv: arxiv.org/abs/2307.02047 Code: github.com/huawei-noah/Pr… (a plug-and-play optimizer repo will be released soon)
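Memory-efficient optimizers in the CAME/Adafactor family avoid storing a full second-moment matrix by keeping only per-row and per-column statistics. As a rough illustration only (this is NOT the authors' CAME implementation — the function names, constants, and reconstruction rule below are simplified assumptions in the Adafactor style), a NumPy sketch of the factored-second-moment idea:

```python
import numpy as np

def factored_second_moment(R, C):
    """Rank-1 reconstruction of the second-moment matrix from
    row stats R (shape n,) and column stats C (shape m,)."""
    return np.outer(R, C) / R.sum()

def factored_step(W, grad, R, C, lr=0.01, beta2=0.999, eps=1e-30):
    """One memory-efficient update for an n x m weight matrix:
    instead of an n*m second-moment buffer, keep only R (n,) and C (m,)."""
    sq = grad ** 2 + eps
    # Exponential moving averages of row/column means of the squared gradient
    R[:] = beta2 * R + (1 - beta2) * sq.mean(axis=1)
    C[:] = beta2 * C + (1 - beta2) * sq.mean(axis=0)
    v_hat = factored_second_moment(R, C)   # approx. per-entry second moment
    W -= lr * grad / np.sqrt(v_hat)        # adaptively scaled update
    return W

# Toy usage: minimize ||W||^2 with the factored update
W = np.ones((4, 3))
R, C = np.zeros(4), np.zeros(3)
for _ in range(5):
    W = factored_step(W, 2 * W, R, C)
```

For an n×m layer this stores n+m optimizer statistics instead of n·m, which is where the memory savings in this optimizer family come from; CAME additionally adds a confidence-guided correction on top (see the paper for the exact algorithm).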
🎉 We're thrilled and honored to have received the ACL Outstanding Paper Award! 🏆 Stay tuned, as we'll be publishing a repository for the CAME optimizer very soon. #ACL2023