Boyuan Chen
@Boyuan__Chen
Assistant Professor @DukeEngineering. Robotics and AI.
Our Perception Stitching has been accepted to TMLR 2024! Please check out our website for all information: generalroboticslab.com/PerceptionStit… If you are excited about cross-embodiment and cross-sensory robot transfer learning, please check it out! This is the follow-up work from our…
Perception Stitching: Zero-Shot Perception Encoder Transfer for Visuomotor Robot Policies Pingcheng Jian, Easop Lee, Zachary I. Bell, Michael M. Zavlanos, Boyuan Chen. Action editor: Emmanuel Bengio. openreview.net/forum?id=tYxRy… #visuomotor #imitation #trained
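To make the idea concrete, here is a minimal sketch of what zero-shot perception encoder transfer can look like: an encoder trained with one robot/sensor configuration is reused, with no fine-tuning, in front of another policy head. This is a toy PyTorch illustration with made-up dimensions and names, not the Perception Stitching method itself.

```python
# Toy sketch of "stitching" a perception encoder onto a different policy head.
# All dimensions and module definitions are assumptions for illustration only.
import torch
import torch.nn as nn

def make_encoder(obs_dim: int = 64, latent_dim: int = 16) -> nn.Module:
    # Stand-in for a visual/perception encoder.
    return nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU(), nn.Linear(128, latent_dim))

def make_policy_head(latent_dim: int = 16, action_dim: int = 7) -> nn.Module:
    # Stand-in for a visuomotor policy head that outputs robot actions.
    return nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, action_dim))

encoder_robot_a = make_encoder()   # pretend: trained with robot A's sensors
head_robot_b = make_policy_head()  # pretend: trained with robot B's actions

# "Stitch": feed robot A's encoder features into robot B's policy head,
# zero-shot (no additional training in this sketch).
obs = torch.rand(1, 64)
action = head_robot_b(encoder_robot_a(obs))
print(action.shape)  # torch.Size([1, 7])
```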
🚀 Just gave a talk at Stanford on building embodied intelligence from the ground up — now available to watch online! I discussed our lab's vision for the Arc of Embodied Intelligence as a developmental cycle: 🧠 Sense, 🔁 Adapt, 🤝 Connect. I introduced some of our efforts…
New work from our lab: CREW-Wildfire 🌲🔥🤖 How scalable really are LLM-based multi-agent Agentic AI systems in dynamic, real-world tasks like wildfire response? Most benchmarks today are not ready to study this problem. Existing environments assume full observability, few…
When I was a PhD student, I wanted to use Eureqa, an incredible symbolic regression (SR) tool that could discover equations from data. Though it was developed by my PhD lab more than 15 years ago, it was closed-source and eventually commercialized. This stayed with me in…
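For readers who have not tried symbolic regression, here is a minimal sketch using the open-source PySR library (named here only as an illustrative tool and an assumption on my part; it is unrelated to Eureqa): given data, it searches for a closed-form equation that fits it.

```python
# Minimal symbolic regression sketch with PySR (illustrative assumption only).
import numpy as np
from pysr import PySRRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))
y = 2.5 * np.cos(X[:, 0]) + X[:, 1] ** 2   # hidden "law" we hope to rediscover

model = PySRRegressor(
    niterations=40,
    binary_operators=["+", "-", "*"],
    unary_operators=["cos"],
)
model.fit(X, y)
print(model)  # prints the best candidate equations found
```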
Our interview on SonicSense with ASME is out! youtube.com/watch?v=LDwE0e… SonicSense enables robots to sense object shape, materials, and categories through acoustic vibrations. It was published at CoRL 2024. It's fully open-sourced here: generalroboticslab.com/SonicSense Please also…
Text2Robot has been accepted to ICRA 2025. See you all soon! Welcome to a great future of automated design of robots!!! We are working hard to have robots walk out of 3D printers :) Please check out Duke’s press release of Text2Robot: pratt.duke.edu/news/text2robo…
Sharing our video on Text2Robot! One step closer to our dream of having physical machines come out of digital computers and 3D printers automatically. We automated robot design using foundation models like text-to-3D models, given only human text prompts. After a day, a…
I am excited to share that CREW has been accepted to Transactions on Machine Learning Research (TMLR 2024)! Please see our earlier threads for more information, and check out Duke's press release: pratt.duke.edu/news/crew-ai-h… A launch pad for Human-AI teaming! Try it out in…
After two years of effort, I am thrilled to share CREW, our latest fully open-source platform for Human-AI Teaming research! Big shoutout to our amazing team at Duke’s General Robotics Lab: Lingyu Zhang (@mkbzly) and Zhengran Ji (@Zhengran_Ji). Discover why we’re so excited and…
Looking forward to talking #xenobots at the @UN's "AI For Good" Global Summit. Other speakers: @GeoffreyHinton, @Yoshua_Bengio, @DrMichaelLevin, Cecilia Laschi. aiforgood.itu.int/summit25/
This is absolutely wrong and unacceptable. It’s ironic that a researcher who discusses AI ethics publicly would make racist remarks at a top ML conference. Unethical and misleading statements have no place in academia or any field.
Mitigating racial bias from LLMs is a lot easier than removing it from humans! Can’t believe this happened at the best AI conference @NeurIPSConf We have ethical reviews for authors, but missed it for invited speakers? 😡
Love Boston? Love science? Then consider attending the AAAS annual meeting! I'll be talking about the AI side of xenobots, Feb 14, 11:30-12:30pm. meetings.aaas.org/registration/ aaas.confex.com/aaas/2025/meet…
GUIDE will be presented at NeurIPS 2024 next week! Please check out Duke's press release on our project: pratt.duke.edu/news/training-…
🚀 We’re thrilled to introduce GUIDE - our framework for real-time human-guided reinforcement learning, enabling continuous human feedback to teach AI agents faster and better. Accepted to #NeurIPS2024! Here’s what makes it special: Why GUIDE? Real-time decision-making is a…
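A minimal sketch of the general idea of real-time human-guided RL, under my own assumptions (not the GUIDE framework): a hypothetical get_human_feedback() stands in for a continuous input channel, and its signal is folded into the environment reward.

```python
# Toy sketch: shaping an RL reward with real-time human feedback.
# `get_human_feedback` is a hypothetical placeholder for an actual input device.
import random
import gymnasium as gym

def get_human_feedback() -> float:
    """Stand-in for a continuous human signal in [-1, 1] (e.g., a slider)."""
    return random.uniform(-1.0, 1.0)

env = gym.make("CartPole-v1")
obs, info = env.reset()
feedback_weight = 0.5  # how strongly human guidance shapes the reward (assumed)

for _ in range(200):
    action = env.action_space.sample()  # placeholder policy
    obs, env_reward, terminated, truncated, info = env.step(action)
    shaped_reward = env_reward + feedback_weight * get_human_feedback()
    # A learning update would consume `shaped_reward` here.
    if terminated or truncated:
        obs, info = env.reset()
env.close()
```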

ClutterGen will also be presented at CoRL 2024 next week. Please come by and say hi!
We present ClutterGen, which tackles two important research problems in robotics: 1) how to generate physically compliant cluttered scenes for robot learning, and 2) how to stably place objects instead of relying on the current "pick and drop" paradigm. Our key idea is to formulate physically…
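As a rough illustration of what "physically compliant" and "stable placement" mean in practice, here is a minimal sketch that drops an object in a PyBullet simulation and checks whether it settles without drifting. This is an assumed, simplified check, not the ClutterGen formulation.

```python
# Toy stability check for an object placement via physics simulation.
import pybullet as p
import pybullet_data

p.connect(p.DIRECT)  # headless physics server
p.setAdditionalSearchPath(pybullet_data.getDataPath())
p.setGravity(0, 0, -9.81)
p.loadURDF("plane.urdf")
obj = p.loadURDF("cube_small.urdf", basePosition=[0, 0, 0.05])

start_pos, _ = p.getBasePositionAndOrientation(obj)
for _ in range(240):  # roughly one second at the default 240 Hz timestep
    p.stepSimulation()
end_pos, _ = p.getBasePositionAndOrientation(obj)

drift = sum((a - b) ** 2 for a, b in zip(start_pos, end_pos)) ** 0.5
print("stable placement" if drift < 0.01 else "unstable placement")
p.disconnect()
```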
SonicSense will be presented at CoRL 2024 next week! We will also host a live demo at the demo sessions. Please check out Duke's press release on this work: pratt.duke.edu/news/sonicsens…
Sharing our video on SonicSense! We built a robot hand that can "see" by "hearing" vibrations. We now have both the software and hardware open-sourced: github.com/generalrobotic… More on generalroboticslab.com/SonicSense
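A minimal sketch of the kind of pipeline such a system might use: turn a contact-microphone signal into spectrogram features that a classifier could map to a material or object category. The signal below is synthetic and the pipeline is an assumption, not the SonicSense code.

```python
# Toy acoustic-sensing pipeline: contact-microphone signal -> spectrogram features.
import numpy as np
from scipy.signal import spectrogram

fs = 44_100  # sample rate in Hz (assumed)
t = np.linspace(0, 0.5, int(0.5 * fs), endpoint=False)
tap = np.sin(2 * np.pi * 800 * t) * np.exp(-8 * t)  # synthetic decaying "tap" vibration

freqs, times, spec = spectrogram(tap, fs=fs, nperseg=1024)
features = np.log1p(spec).mean(axis=1)  # simple per-frequency energy summary

# A real system would feed `features` (or the full spectrogram) to a trained
# classifier that predicts material or object category.
print(features.shape)
```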
🎬 Exciting times at the Duke General Robotics Lab! This past week, our team took a well-deserved break with a private screening of The Wild Robot 🤖. This special gathering gives us a chance to unwind, connect, and reflect on the innovative intersection of nature and…


🚀 We’re thrilled to share our new research WildFusion: Multimodal Implicit 3D Reconstructions in the Wild. Our robot doesn’t just “see” with LiDAR and cameras—it feels, hears, and learns using tactile sensors, IMU data, and contact microphones. Combining these signals creates…
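A minimal sketch of an implicit multimodal reconstruction model under my own assumptions (not the WildFusion architecture): an MLP maps a 3D query point plus a fused feature vector from the different sensors to an occupancy value.

```python
# Toy implicit 3D reconstruction field conditioned on fused multimodal features.
import torch
import torch.nn as nn

class ImplicitField(nn.Module):
    def __init__(self, feat_dim: int = 32, hidden: int = 128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3 + feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # occupancy logit at the query point
        )

    def forward(self, xyz: torch.Tensor, fused_feat: torch.Tensor) -> torch.Tensor:
        return self.mlp(torch.cat([xyz, fused_feat], dim=-1))

model = ImplicitField()
points = torch.rand(1024, 3)     # 3D query locations
features = torch.rand(1024, 32)  # placeholder fused LiDAR/vision/tactile/IMU/audio features
occupancy_logits = model(points, features)
print(occupancy_logits.shape)    # torch.Size([1024, 1])
```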

🚨**New Research Alert!**🚨 Imagine guiding a team of robots with just ONE person! Our latest work, HUMAC, shows how single-human guidance can unlock powerful multi-agent collaboration in tasks like hide-and-seek with autonomous robots. 🤖💡 The challenge: Collaboration is a…
We, at Duke General Robotics Lab, are thrilled to introduce The Duke Humanoid v1 — a fully open-source platform designed for dynamic and energy-efficient bipedal locomotion! 🤖👇 1. A fully open-source humanoid: We provide fully open-sourced hardware and software…
The Duke General Robotics Lab is hiring! We have two open positions: one Postdoctoral Researcher and one Research Associate (a full-time role for post-Bachelor or post-Master candidates). We’re looking for innovative thinkers with a passion for bold, unconventional ideas.…
