Rishabh Dabral
@rishabh_dabral
Research Group Leader at MPI-Informatics
{1/8} 🧵 When you click a link, have you ever wondered: “Which webpage is actually important?” Google answered that with PageRank—treating the web as a Markov chain. Now imagine doing the same… but for transformer attention.👇 🔗 yoterel.github.io/attention_chai…
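For readers who want the gist before the thread: below is a minimal sketch of the PageRank-on-attention idea, assuming a single row-stochastic attention matrix and standard damped power iteration. The damping value and how heads/layers would be aggregated are illustrative assumptions, not the paper's choices.

```python
# Minimal sketch (not the paper's implementation): treat one attention
# matrix as a Markov chain over tokens and compute a PageRank-style
# stationary distribution via power iteration.
import numpy as np

def attention_pagerank(attn, damping=0.85, iters=100, tol=1e-8):
    """attn: (T, T) row-stochastic attention matrix (softmax output).
    Returns a length-T importance score per token."""
    T = attn.shape[0]
    # Damped transition matrix, as in classic PageRank: with probability
    # `damping` follow attention, otherwise jump to a uniformly random token.
    P = damping * attn + (1.0 - damping) / T
    pi = np.full(T, 1.0 / T)          # start from the uniform distribution
    for _ in range(iters):
        pi_next = pi @ P              # one step of the Markov chain
        if np.abs(pi_next - pi).sum() < tol:
            break
        pi = pi_next
    return pi / pi.sum()

# Toy usage: random logits -> row-wise softmax -> token importance scores.
rng = np.random.default_rng(0)
attn = np.exp(rng.normal(size=(6, 6)))
attn /= attn.sum(axis=1, keepdims=True)
print(attention_pagerank(attn))
```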
🏃Today at CVPR!!! 📅🕐 13:00–17:00, 📍Room 110B 💃
🎉 The #CVPR HuMoGen Workshop is happening TODAY afternoon! We’ll be featuring an exciting lineup of invited talks and poster presentations covering cutting-edge work in generative modeling of 3D and 2D human motion. If you’re working in this space, you won’t want to miss it!
Full body tracking from a VR headset? In our #CVPR2025 Highlight paper “FRAME: Floor-aligned Representation for Avatar Motion from Egocentric Video”, we show it’s possible using only body-facing cameras. 🧵...
#CVPR2025 Want to put life into your LLM chatbots with gesticulating characters? Check out @mhamzamughal09's work on generating semantically meaningful gestures using RAG with a gesture diffusion model. Congratulations to Hamza and collaborators at @VcaiMpi and @neuroexplicit !
Code and models coming soon! Paper: arxiv.org/abs/2412.06786 Videos with explanation: vcai.mpi-inf.mpg.de/projects/RAG-G… Many thanks to the collaborators: @rishabh_dabral Merel Scholman @vdemberg Christian Theobalt @neuroexplicit
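For context, here is a hypothetical sketch of the general retrieval-augmented pattern mentioned above: retrieve gesture exemplars whose speech embeddings are closest to the input, then hand them to the generator as extra conditioning. Every name, shape, and module below is a stand-in, not the RAG-Gesture implementation.

```python
# Hypothetical RAG-style sketch (NOT the RAG-Gesture code): nearest-neighbour
# retrieval over a toy gesture database, followed by a stand-in conditional
# generator that receives the retrieved exemplars.
import numpy as np

rng = np.random.default_rng(1)

# Toy "database": N exemplar clips, each with a speech embedding (D,)
# and a gesture sequence (FRAMES, DOF). All sizes are made up.
N, D, FRAMES, DOF = 100, 32, 60, 72
db_speech = rng.normal(size=(N, D))
db_gesture = rng.normal(size=(N, FRAMES, DOF))

def retrieve(query_emb, k=3):
    """Return the k gesture exemplars with highest cosine similarity."""
    q = query_emb / np.linalg.norm(query_emb)
    keys = db_speech / np.linalg.norm(db_speech, axis=1, keepdims=True)
    idx = np.argsort(keys @ q)[-k:]
    return db_gesture[idx]                      # (k, FRAMES, DOF)

def generate(query_emb, denoiser, steps=50):
    """Condition a (stand-in) iterative generator on retrieved exemplars."""
    exemplars = retrieve(query_emb)
    x = rng.normal(size=(FRAMES, DOF))          # start from noise
    for t in range(steps, 0, -1):               # naive refinement loop
        x = denoiser(x, t, query_emb, exemplars)
    return x

# A do-nothing stand-in denoiser so the sketch runs end to end.
dummy_denoiser = lambda x, t, cond, exemplars: x - 0.01 * x
motion = generate(rng.normal(size=D), dummy_denoiser)
print(motion.shape)                              # (60, 72)
```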
One week before the HuMoGen deadline. The submission portal is open at: openreview.net/group?id=thecv…
We invite you to submit your Motion Generation papers to the HuMoGen @CVPR workshop! The deadline is on March 12 More details @ humogen.github.io
🚀 Check out our #CVPR2025 paper BimArt! 👐🔧 We generate 3D bimanual interactions with articulated objects by: ✅ Predicting contact maps from object trajectories ✅ Using an articulation-aware feature representation Project Webpage: vcai.mpi-inf.mpg.de/projects/bimar… @VcaiMpi
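As a rough illustration of such a two-stage setup (contact prediction, then hand generation), here is a hypothetical skeleton; the modules, feature dimensions, and shapes are placeholders and do not reflect the BimArt architecture.

```python
# Hypothetical two-stage skeleton in the spirit of the tweet (NOT the BimArt
# code): stage 1 predicts per-point contact maps from the object trajectory,
# stage 2 generates two-hand poses from contacts plus an articulation-aware
# object feature. All modules and dimensions are stand-ins.
import torch
import torch.nn as nn

class ContactPredictor(nn.Module):
    """Stage 1: object trajectory -> per-point contact probabilities."""
    def __init__(self, traj_dim=9, points=512, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(traj_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, points))

    def forward(self, traj):                  # traj: (T, traj_dim)
        return torch.sigmoid(self.net(traj))  # (T, points)

class HandGenerator(nn.Module):
    """Stage 2: contacts + articulation features -> two-hand pose vectors."""
    def __init__(self, points=512, art_dim=16, hand_dim=2 * 45, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(points + art_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hand_dim))

    def forward(self, contacts, art_feat):    # (T, points), (T, art_dim)
        return self.net(torch.cat([contacts, art_feat], dim=-1))  # (T, 90)

T = 30
traj = torch.randn(T, 9)    # e.g. object pose + articulation angle per frame
art = torch.randn(T, 16)    # stand-in articulation-aware feature
contacts = ContactPredictor()(traj)
hands = HandGenerator()(contacts, art)
print(hands.shape)          # torch.Size([30, 90])
```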
Now hiring: Multiple PhD students to start in fall 2025, for research on combining neural and symbolic/interpretable models of language, vision, and action. Work with world-class advisors at @Saar_Uni, MPI Informatics, @mpi_sws, @CISPA, @DFKI. Details: neuroexplicit.org/jobs/
Human Motion people - There is still a path to Nashville...Just saying @CVPR
We're back! 🚶♀️🕺 The Human Motion Generation Workshop returns to @CVPR with an amazing lineup of speakers from academia and industry. 🎤✨ 📢 Call for Papers and details: humogen.github.io Speakers include @anorangeduck, @liuziwei7, Yuting Ye, and Taku Komura.
[1/6]🧵: Our project CasperDPM, a method to augment hands in real time, has been accepted to #SIGGRAPHAsia2024! Project Page: yoterel.github.io/casper-project… -->
The full recording of the workshop is now available 📽️📽️📽️ humogen.github.io youtube.com/watch?v=lkQ4sD…
🌟 A huge thank you to our incredible speakers, poster presenters, and the enthusiastic audience for making this event a success! 🙏👏 @CVPR #HuMoGen
#ECCV2024 Presenting ReMoS, our latest work on Reactive Motion Synthesis for modeling 2-person interactions. Project page: vcai.mpi-inf.mpg.de/projects/remos/ Team: @ghosh_ani12, @rishabh_dabral, @VGolyanik, C. Theobalt, P. Slusallek
How can we meta-learn a prior for human radiance and SDF fields to tackle ambiguities in the sparse (few camera) performance capture setting? Check out our #ECCV 2024 work MetaCap! Project page: vcai.mpi-inf.mpg.de/projects/MetaC… Team: @GuoxingSun, @rishabh_dabral, @FuaPv, Christian Theobalt
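To give a feel for what meta-learning a prior can look like in general, here is a generic Reptile-style sketch with a toy field network. It is not the MetaCap method; the data, loss, and network are stand-ins.

```python
# Generic Reptile-style meta-learning sketch (NOT the MetaCap code):
# learn an initialization for a small field network across many subjects,
# so that a few gradient steps on sparse observations already give a
# usable subject-specific fit.
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_field():
    # Stand-in for a radiance/SDF field: maps a 3D point to one scalar.
    return nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1))

def inner_adapt(model, pts, targets, steps=5, lr=1e-2):
    """A few gradient steps on one subject's (toy) supervision."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        F.mse_loss(model(pts), targets).backward()
        opt.step()
    return model

meta = make_field()
meta_lr = 0.1
for task in range(100):                         # each "task" = one subject
    pts = torch.randn(256, 3)                   # toy supervision; real data
    targets = pts.norm(dim=1, keepdim=True) - 0.5   # would come from captures
    fast = make_field()
    fast.load_state_dict(meta.state_dict())
    inner_adapt(fast, pts, targets)
    # Reptile outer update: move the meta-weights toward the adapted weights.
    with torch.no_grad():
        for p_meta, p_fast in zip(meta.parameters(), fast.parameters()):
            p_meta += meta_lr * (p_fast - p_meta)
# At test time, `meta` would be fine-tuned with a few steps on sparse views.
```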
Starting now in Room 430, Summit Building. @humogen11384

Coming up tomorrow (Tue) morning🤩🤩🤩 Summit 430 - starting at 8:30 AM
HuMoGen @CVPR is happening next week🔥🔥🔥 [Tue, June 18, from 8:30 AM to 1:00 PM @ Summit 430] Don't miss our great speakers and 21 accepted papers!! Full schedule @ humogen.github.io
Paper decisions are now available at OpenReview🕺
We invite you to submit your Motion Generation papers to the HuMoGen @CVPR workshop! The deadline is on March 19 More details @ humogen.github.io
[1/2] Excited about your paper being accepted to CVPR? Looking to extend its impact to more movers and shakers in the Human Motion Generation community? 🕺💡 Submit a one-page abstract to the HuMoGen Workshop at #CVPR2024 and reach an even broader audience! 📝✨ @CVPR #HuMoGen
Introducing our latest work on multi-party gesture synthesis, accepted in #CVPR2024. We take a step toward controllable gesture synthesis. What's more, we introduce the Dungeons & Dragons (DnD) dataset: 5 participants playing DnD for hours and, well, naturally gesticulating.
#CVPR2024 We present ConvoFusion - a diffusion-based approach for monadic and dyadic gesture synthesis that also allows for user controllability. Project Page: vcai.mpi-inf.mpg.de/projects/Convo… Details below.
The HuMoGen @CVPR deadline is in 6 days🙀🙀🙀 We invite you to submit your #MOTION papers! More details @ humogen.github.io