Robotic Systems Lab
@leggedrobotics
The Robotic Systems Lab designs machines, creates actuation principles, and develops control technologies for autonomous operation in challenging environments.
Beyond automation and precision, "A Robot’s Dream" asked something deeper: what does it mean for a robot to express, to participate in art, to reflect us back to ourselves? Proud to support this exploration at #LaBiennale by Ryan Batke and @breadli428. gramaziokohler.arch.ethz.ch/web/projekte/e…
Best Systems Paper finalist at #RSS2025 🚀 Excited to share our work on a perceptive forward dynamics model for safe, platform-aware navigation. 📄 arxiv.org/pdf/2504.19322 🌐 leggedrobotics.github.io/fdm.github.io/ #Robotics #Planning #RSS
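For readers curious how a forward dynamics model can drive safe navigation, here is a minimal sketch, assuming a learned network that maps a robot state and a sampled sequence of velocity commands to predicted displacements and a failure risk; all class names, shapes, and the scoring rule are illustrative assumptions, not the paper's implementation:

```python
# Hypothetical sketch: a learned forward dynamics model used to score
# sampled velocity-command sequences for safe, platform-aware navigation.
import torch
import torch.nn as nn

class ForwardDynamicsModel(nn.Module):
    def __init__(self, state_dim=16, cmd_dim=3, hidden=128):
        super().__init__()
        self.encoder = nn.GRU(cmd_dim, hidden, batch_first=True)
        self.state_proj = nn.Linear(state_dim, hidden)
        self.pose_head = nn.Linear(hidden, 2)   # per-step 2D displacement
        self.risk_head = nn.Linear(hidden, 1)   # per-step failure-risk logit

    def forward(self, state, commands):
        # state: (B, state_dim); commands: (B, horizon, cmd_dim)
        h0 = self.state_proj(state).unsqueeze(0)      # (1, B, hidden)
        feats, _ = self.encoder(commands, h0)         # (B, horizon, hidden)
        return self.pose_head(feats), self.risk_head(feats)

def score_commands(model, state, candidates, goal, risk_weight=5.0):
    """Rank N sampled command sequences: goal progress minus predicted risk.

    state: (state_dim,), candidates: (N, horizon, cmd_dim), goal: (2,)
    """
    with torch.no_grad():
        n = candidates.shape[0]
        poses, risk_logits = model(state.unsqueeze(0).expand(n, -1), candidates)
        final_dist = torch.norm(poses[:, -1] - goal, dim=-1)   # (N,)
        risk = torch.sigmoid(risk_logits).mean(dim=(1, 2))     # (N,)
    return -(final_dist + risk_weight * risk)                  # higher is better
```

A sampling-based planner would then execute the best-scoring candidate in receding-horizon fashion; the weighting between progress and risk is a design choice.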
A legged mobile manipulator trained to play badminton with humans coordinates whole-body maneuvers and onboard perception. Paper: science.org/doi/10.1126/sc… Video: youtu.be/zYuxOVQXVt8 @Yuntao144, Andrei Cramariuc, Farbod Farshidian, Marco Hutter
ANYmal just learned parkour in the wild! 9 expert skills → 1 foundation policy → RL fine-tuning → any terrain with a single end-to-end policy from cameras to motor control. Paper: arxiv.org/abs/2505.11164 Watch: youtu.be/QDU_FicBPDo @rdn_nikita Junzhe He Joshua Aurand
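The pipeline in the tweet (expert skills → foundation policy → RL fine-tuning) can be pictured with a short sketch. Stage one below is an assumption-laden illustration of distilling expert skills by behavior cloning; the actual architecture and losses in the paper may differ:

```python
# Hedged sketch of the two-stage recipe: distill several expert skill
# policies into one "foundation" policy, then fine-tune it with RL.
import torch
import torch.nn as nn

class FoundationPolicy(nn.Module):
    def __init__(self, obs_dim=256, act_dim=12, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ELU(),
            nn.Linear(hidden, hidden), nn.ELU(),
            nn.Linear(hidden, act_dim),
        )

    def forward(self, obs):
        return self.net(obs)

def distill_step(policy, optimizer, obs, expert_actions):
    """One behavior-cloning step on (obs, action) pairs pooled from all
    expert skill policies (e.g. climb, jump, crouch)."""
    loss = nn.functional.mse_loss(policy(obs), expert_actions)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Stage two would reuse `policy` as the initial actor of an on-policy RL
# algorithm (e.g. PPO) and fine-tune it end to end across terrains.
```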
“A Robot’s Dream” explores the relationship between automation, craftsmanship, and materiality in architecture. Ryan Batke and Chenhao Li (@breadli428) from RSL contributed to the project by supporting motion imitation for a humanoid robot. Read more gramaziokohler.arch.ethz.ch/web/projekte/e….
"A Robot’s Dream", a project led by Gramazio Kohler Research, will be exhibited at La Biennale Architettura 2025 @la_Biennale in Venice. Dongho Kang (@eastskykang) and Yijiang Huang from CRL contributed to the project by supporting motion generation for a humanoid robot.
Our robot puppy dances, walks, and tracks gaze—all at once. At #ICRA2025, we’re presenting Deep Fourier Mimic, a generalization of DeepMimic for expressive, multitask motion control. Come chat with us about smooth transitions, frequency modulation, and dancing robots!
🏙️ This week in Atlanta, we are presenting our work on Deep Fourier Mimic @ieee_ras_icra! 🧠 Free yourself from training GANs when learning from demonstrations and learn from MoCap data with ease! 🤖 Come talk with us! sony.github.io/DFM/ #ICRA2025
Check out our #ICRA2025 paper! Robust, precise, and terrain-aware: our RL controller significantly improves over baselines for whole-body 6-DoF tracking. 💥 Project website: leggedrobotics.github.io/wholebody-pose… Work led by @TifannyPortela, Andrei Cramariuc, Mayank Mittal, and Marco Hutter
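As a rough illustration of what "whole-body 6-DoF tracking" could be rewarded on, here is one common reward shaping for pose tracking: exponentiated position and orientation errors. The function name, scales, and error terms are hypothetical and are not taken from the paper:

```python
# Illustrative 6-DoF pose-tracking reward sketch (not the paper's reward).
import numpy as np

def pose_tracking_reward(pos, pos_target, quat, quat_target,
                         sigma_pos=0.25, sigma_rot=0.5):
    """pos: (3,) position [m]; quat: (4,) unit quaternion (w, x, y, z)."""
    pos_err = np.linalg.norm(pos - pos_target)
    # Geodesic angle between orientations: 2 * acos(|<q1, q2>|)
    dot = np.clip(abs(np.dot(quat, quat_target)), -1.0, 1.0)
    rot_err = 2.0 * np.arccos(dot)
    return np.exp(-(pos_err / sigma_pos) ** 2) + np.exp(-(rot_err / sigma_rot) ** 2)
```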
Congratulations to @JonasFrey96 for a successful defense of "Learning Perception and Navigation for Autonomous Robots in the Wild"! @eth_dmavt @ETH_en Thanks to @MauriceFallon @GMartius @NASAJPL

Congrats to Yuntao Ma for successfully defending his PhD thesis "Towards Agile Whole-Body Legged Loco-Manipulation". Can't wait to play badminton against #ANYmal. @eth_dmavt @ETH_en @GMartius

Congratulations to @ki_ki_ki1, master of perceptive RL-based legged locomotion, for a successful defense of the thesis “Bridging Perception and Control for Legged Locomotion and Navigation in the Wild". @ETH_en @eth_dmavt @pulkitology

We introduce Deep Fourier Mimic, a generalized version of DeepMimic, which enables automatic parameterization of reference motions. This means you can learn diverse motions with a single policy conditioned on their meaningful spatial and temporal representations! #ICRA2025
Check out our #ICRA2025 paper where we train a robotic puppy to dance expressively! Our method, Deep Fourier Mimic, a generalized version of DeepMimic, enables automatic parameterization of reference data. Project site: sony.github.io/DFM/ Video: youtube.com/watch?v=Do4HmC… 🧵
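"Automatic parameterization of reference data" can be made concrete with a small sketch: represent each reference motion by the Fourier coefficients of its joint trajectories, then condition a single policy on those coefficients plus a phase variable. Shapes and helper names below are assumptions for illustration, not the DFM code:

```python
# Minimal sketch of Fourier-based motion parameterization (assumed shapes).
import numpy as np

def fourier_parameterize(traj, n_freqs=8):
    """traj: (T, J) joint-angle trajectory -> (J, n_freqs) complex spectra.

    Requires n_freqs <= T // 2 + 1 (the rFFT length along time).
    """
    coeffs = np.fft.rfft(traj, axis=0)[:n_freqs]   # (n_freqs, J)
    return coeffs.T                                # per-joint spectra

def policy_input(obs, coeffs, phase):
    """Concatenate proprioceptive obs with the motion's spectral features
    and the current phase (cos/sin so the input is continuous in time)."""
    spectral = np.concatenate([coeffs.real.ravel(), coeffs.imag.ravel()])
    return np.concatenate([obs, spectral, [np.cos(phase), np.sin(phase)]])
```

One policy fed inputs of this form can, in principle, cover many motions and modulate their speed through the phase rate, matching the tweets' description of diverse motions under a single conditioned policy.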