Chenxiao Yang
@chenxiao_yang_
PhD @ TTIC, Working on LLM Reasoning, ML/DL Theory, Graph Learning
How can we scale test-time compute in a provably optimal way? Here’s a simple solution. Check out our #ICML2025 paper "PENCIL: Long Thoughts with Short Memory" 📍 Poster # E-2304 📅 Thu, Jul 16 | 🕓 4:30–7:00
I've discovered a truly marvelous idea for building AGI, but Twitter's space limit won't let me explain it! Damn! 😫 Introducing ✏️PENCIL, a new LLM reasoning paradigm that generates and erases thoughts, enabling longer and deeper thinking with shorter context. #ICML2025 🧵1/n…
How can 🔥graph foundation models🔥 handle topological shifts? Our #icml25 paper presents a physics-inspired Transformer that offers provable generalization guarantees and substantial improvements across diverse graph tasks, with @chenxiao_yang_, Kaipeng Zeng, @mmbronstein
"In this article, we challenge the conventional 'write-only' CoT reasoning paradigm that dominates current LLM architectures, from both theoretical and practical perspectives." Chenxiao Yang presents insights based on their recent paper. towardsdatascience.com/empowering-llm…
How exactly do GNNs learn from graph structures? Is it possible to develop much more parsimonious models with similar generalization performance? Check out our #ICML2024 paper "How Graph Neural Networks Learn: Lessons from Training Dynamics" Poster: #506, Thu, Jul. 25, 13:30-15:00

Generalization under distribution shifts is a major desideratum for graph learning. How can we achieve it in a principled way? #ICML2024 Check out our poster at #507, Wed, July 24 (13:30-15:00), and this blog: towardsdatascience.com/towards-genera…
In a fascinating new post, @mmbronstein, @qitianwu_, and @Chenxia58917359 walk us through their latest research into a new diffusion-based continuous GNN model that offers better generalization capabilities. buff.ly/4057cyH