Russ Tedrake
@RussTedrake
Professor at MIT, studying robotics. Vice President of Robotics Research, Toyota Research Institute.
This #RSS2024 on July 19, we are organizing a tutorial on supervised policy learning for real-world robots! Talks by @notmahi & @RussTedrake will cover the fundamentals of imitation learning, recent algorithms, code walk-throughs, and practical considerations. supervised-robot-learning.github.io
Very proud of Nicholas, who recently shared scalable-real2sim.github.io (for physics-quality assets from a small amount of interaction with a robot) and is now following up with his work on scene-level generation.
Want to scale robot data with simulation, but don’t know how to get large numbers of realistic, diverse, and task-relevant scenes? Our solution: ➊ Pretrain on broad procedural scene data ➋ Steer generation toward downstream objectives 🌐 steerable-scene-generation.github.io 🧵1/8
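A minimal sketch of step ➋ in the simplest possible form: steering a pretrained generator toward a downstream objective by best-of-N resampling (draw candidates, keep the highest-scoring one). `sample_scene` and `score` are hypothetical stand-ins for the pretrained scene generator and the task-relevance objective; the actual steering method in the paper may differ.

```python
import numpy as np

def steer_by_resampling(sample_scene, score, n_candidates, rng):
    """Best-of-N steering: draw candidate scenes from a pretrained
    generator, then keep the one the downstream objective scores highest.

    sample_scene(rng) -> scene   (hypothetical pretrained generator)
    score(scene) -> float        (hypothetical task-relevance objective)
    """
    scenes = [sample_scene(rng) for _ in range(n_candidates)]
    scores = np.array([score(s) for s in scenes])
    return scenes[int(np.argmax(scores))]
```

This decouples the two stages cleanly: the generator only needs broad coverage, and the objective only needs to rank samples.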
New Paper: "Scalable Real2Sim: Physics-Aware Asset Generation via Robotic Pick-and-Place Setups"! 🤖 We introduce a fully automated pipeline that generates simulation-ready assets for real-world objects—no manual intervention needed! 🌐 Website: scalable-real2sim.github.io (1/5)⬇️
Announcing Diffusion Forcing Transformer (DFoT), our new video diffusion algorithm that generates ultra-long videos of 800+ frames. DFoT enables History Guidance, a simple add-on to any existing video diffusion model for a quality boost. Website: boyuan.space/history-guidan… (1/7)
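In its simplest form, guidance on history conditioning looks like classifier-free guidance applied to the past frames: extrapolate the history-conditioned noise prediction away from the unconditional one. This sketch shows only that basic combination step; the full History Guidance method composes several such terms, which is an assumption-free simplification here, not the complete algorithm.

```python
import numpy as np

def history_guidance(eps_cond, eps_uncond, w):
    """CFG-style extrapolation toward the history-conditioned prediction.

    eps_cond   : noise prediction conditioned on past frames
    eps_uncond : noise prediction with the history dropped out
    w = 1 recovers the plain conditional model; w > 1 pushes samples
    to be more consistent with the observed history.
    """
    return eps_uncond + w * (eps_cond - eps_uncond)
```

Because it only combines model outputs at sampling time, a term like this can be bolted onto a pretrained video diffusion model without retraining.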
I'm super excited to start a great new collaboration with the fantastic team at Boston Dynamics. Scott Kuindersma and I chatted with Evan Ackerman about it earlier today. spectrum.ieee.org/boston-dynamic…
Introducing Diffusion Forcing, which unifies next-token prediction (e.g., LLMs) and full-sequence diffusion (e.g., Sora)! It offers improved performance and new sampling strategies in vision and robotics, such as stable, infinite video generation, better diffusion planning, and more! (1/8)
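The unification rests on giving each token its own noise level: a shared level across the sequence recovers full-sequence diffusion, while a "clean past, noisy present, fully-noised future" pattern recovers next-token prediction. A rough sketch of those noise-level patterns, assuming levels in [0, 1] with 0 = clean and 1 = pure noise (function names are illustrative, not from the paper's code):

```python
import numpy as np

def sample_noise_levels(seq_len, mode, rng):
    """Per-token noise levels in [0, 1]; 0 = clean, 1 = pure noise."""
    if mode == "full_sequence":
        # Standard diffusion: one shared noise level for every token.
        return np.full(seq_len, rng.uniform())
    if mode == "autoregressive":
        # Next-token prediction: past tokens clean, current token partially
        # noised, future tokens fully noised.
        t = rng.integers(seq_len)
        levels = np.ones(seq_len)
        levels[:t] = 0.0
        levels[t] = rng.uniform()
        return levels
    if mode == "independent":
        # Diffusion Forcing training: each token draws its own level, so
        # both patterns above are special cases the model has seen.
        return rng.uniform(size=seq_len)
    raise ValueError(f"unknown mode: {mode}")
```

Training under the "independent" pattern is what lets a single model be sampled in either regime at test time.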
And if you’re at all interested in humanoids you need to check out Punyo, from TRI, a soft-body humanoid capable of whole-body manipulation. A soft robot like this could be extremely helpful for in-home robots. Check out their medium post: medium.com/toyotaresearch…
A bit surprised TRI's new robot has gotten little love on X. Current humanoid approaches seem very hand-centric, but we use tactile sensing across our whole body to do tasks.
Check out @chichengcc's step-by-step tutorial on building the UMI gripper. We really hope to see more UMIs running in the wild. 😊
We made a step-by-step video tutorial for building the UMI gripper! Please leave comments on @YouTube if you have any questions: youtu.be/x3ko0v_xwpg
Can we collect robot data without any robots? Introducing Universal Manipulation Interface (UMI) An open-source $400 system from @Stanford designed to democratize robot data collection 0 teleop -> autonomously wash dishes (precise), toss (dynamic), and fold clothes (bimanual)