Jimmy Wu
@jimmyyhwu
CS PhD student @Princeton. Robot learning and computer vision.
When will robots help us with our household chores? TidyBot++ brings us closer to that future. Our new open-source mobile manipulator makes it more accessible and practical to do robot learning research outside the lab, in real homes!
Haoyu built an awesome bimanual + neck robot which can be easily mounted on the TidyBot++ mobile base. Hardware design is fully open source! Check out his thread below to learn more 👇
Your bimanual manipulators might need a Robot Neck 🤖🦒 Introducing Vision in Action: Learning Active Perception from Human Demonstrations ViA learns task-specific, active perceptual strategies—such as searching, tracking, and focusing—directly from human demos, enabling robust…
Today we're excited to share a glimpse of what we're building at Generalist. As a first step towards our mission of making general-purpose robots a reality, we're pushing the frontiers of what end-to-end AI models can achieve in the real world. Here's a preview of our early…
How can we move beyond static-arm lab setups and learn robot policies in our messy homes? We introduce HoMeR, an imitation learning agent for in-the-wild mobile manipulation. 🧵1/8
🤖 Ever dreamed of controlling a humanoid robot to perform complex, long-horizon tasks — using just a single Vision Pro? 🎉 Meet CLONE: a holistic, closed-loop, whole-body teleoperation system for long-horizon humanoid control! 🏃♂️🧍 CLONE enables rich and coordinated…
Introducing Mobi-π: Mobilizing Your Robot Learning Policy. Our method: ✈️ enables flexible mobile skill chaining 🪶 without requiring additional policy training data 🏠 while scaling to unseen scenes 🧵↓
🤖 Can a humanoid robot hold extreme single-leg poses like Bruce Lee's Kick or the Swallow Balance? 🤸 💥 YES. Meet HuB: Learning Extreme Humanoid Balance 🔗 Project website: hub-robot.github.io
🤖Introducing TWIST: Teleoperated Whole-Body Imitation System. We develop a humanoid teleoperation system to enable coordinated, versatile, whole-body movements, using a single neural network. This is our first step toward general-purpose robots. 🌐humanoid-teleop.github.io
Meet 𝐀𝐌𝐎 — our universal whole‑body controller that unleashes the 𝐟𝐮𝐥𝐥 kinematic workspace of humanoid robots to the physical world. AMO is a single policy trained with RL + Hybrid Mocap & Trajectory‑Opt. Accepted to #RSS2025. Try our open models & more 👉…
Low-cost teleop systems have democratized robot data collection, but they lack any force feedback, making it challenging to teleoperate contact-rich tasks. Many robot arms provide force information — a critical yet underutilized modality in robot learning. We introduce: 1. 🦾A…
Time to democratize humanoid robots! Introducing ToddlerBot, a low-cost ($6K), open-source humanoid for robotics and AI research. Watch two ToddlerBots seamlessly chain their loco-manipulation skills to collaborate in tidying up after a toy session. toddlerbot.github.io
The ultimate test of any physics simulator is its ability to deliver real-world results. With MuJoCo Playground, we’ve combined the very best: MuJoCo’s rich and thriving ecosystem, massively parallel GPU-accelerated simulation, and real-world results across a diverse range of…
I first learned the term "holonomic" during a conversation with @jimmyyhwu while experimenting with the AgileX Ranger Mini. Despite being omnidirectional, the Ranger Mini isn't holonomic. Initially, I thought this distinction didn't matter—until I tried to make the base follow…
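The distinction can be made concrete with a minimal sketch (my own illustration, not from the thread): a holonomic base can realize any planar body twist (vx, vy, ω) instantly, while a nonholonomic base such as a differential drive cannot translate sideways, so a commanded lateral motion is simply unrealizable without first reorienting.

```python
import numpy as np

# Minimal sketch: a holonomic base realizes any body-frame twist
# (vx, vy, omega), while a nonholonomic differential drive is
# constrained to zero lateral body velocity (vy = 0).

def holonomic_step(pose, twist, dt):
    # pose = (x, y, theta) in the world frame; twist in the body frame
    x, y, th = pose
    vx, vy, om = twist
    # Rotate the body twist into the world frame and integrate
    x += (vx * np.cos(th) - vy * np.sin(th)) * dt
    y += (vx * np.sin(th) + vy * np.cos(th)) * dt
    th += om * dt
    return (x, y, th)

def diff_drive_step(pose, twist, dt):
    # Nonholonomic constraint: the commanded lateral velocity vy
    # is unrealizable, so it is dropped.
    vx, _, om = twist
    return holonomic_step(pose, (vx, 0.0, om), dt)

# Command a pure sideways motion (vy = 0.5 m/s) for 1 s
pose_h = (0.0, 0.0, 0.0)
pose_d = (0.0, 0.0, 0.0)
for _ in range(100):
    pose_h = holonomic_step(pose_h, (0.0, 0.5, 0.0), 0.01)
    pose_d = diff_drive_step(pose_d, (0.0, 0.5, 0.0), 0.01)

print(pose_h)  # holonomic base moved ~0.5 m sideways
print(pose_d)  # differential drive did not move at all
```

This is why an omnidirectional-but-nonholonomic base (wheels must steer before the chassis can move laterally) lags a trajectory in a way a truly holonomic base does not.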
what's the move? a question i ask the group chat when i’m bored... and now, my robot too. enter SPHINX: a hybrid IL agent that dynamically selects its action space (waypoints / dense actions) and input type (point clouds / wrist images) for precise, generalizable manip! 🧵 1/7
Love the open-source holonomic base design! It enables more natural movements in demonstrations and autonomous execution.
I have long awaited a holonomic mobile manipulator with a compliant arm! It was incredible to witness these demos in @jimmyyhwu ’s apartment
In early 2023, I explored over 20 companies worldwide but could not find a robot base that met my research requirements: 1. Small size (50 cm) but high payload: capable of supporting any robot arm. 2. Holonomic mobility: able to move freely in all directions, like a floating office…
Robot manipulation research needs a mobile base. Although we see lots of exciting tabletop manipulation demos, the scenes are set up by humans and can be arranged to be easy for the robot to reach. Open-world manipulation needs to be mobile