Elgce
@BenQingwei
Ph.D. at MMLab, CUHK http://www.qingweiben.com
🫰Thrilled to introduce HOMIE: Humanoid Loco-Manipulation with Isomorphic Exoskeleton Cockpit. Website: homietele.github.io Code: github.com/OpenRobotLab/O… YouTube: youtu.be/FxkGmjyMc5g 😀 HOMIE consists of a novel RL-based training framework and a self-designed hardware…
Amazing work! Impressive progress towards Humanoid Scene Interaction!
🚀 Introducing LeVERB, the first 𝗹𝗮𝘁𝗲𝗻𝘁 𝘄𝗵𝗼𝗹𝗲-𝗯𝗼𝗱𝘆 𝗵𝘂𝗺𝗮𝗻𝗼𝗶𝗱 𝗩𝗟𝗔 (upper- & lower-body), trained on sim data and zero-shot deployed. Addressing interactive tasks: navigation, sitting, locomotion with verbal instruction. 🧵 ember-lab-berkeley.github.io/LeVERB-Website/
RoboDuet has been accepted by IROS for presentation! See you in Hangzhou this October! 😭Really eager to be there in person
After a long time, RoboDuet, the first research project of my life, has finally been accepted by RAL! This is really inspiring for me. RoboDuet is fully open-sourced, including training code and deployment code. If you're interested in it, just give it a try!
Just a temporary helper today, but very excited to join the HOMIE squad! 👊 Here to support my amazing friends Qingwei @BenQingwei (online 👀) and Feiyu @Jia_Fei_Yu 💪 Don’t miss HOMIE at #RSS2025 — and don’t miss the chance to chat with Qingwei👇 WeChat QR here!
HOMIE will be presented at #RSS2025 today! Spotlight Talks: 4:30pm-5:30pm Poster: 6:30pm-8:00pm BoardNr: 34 @li_yitang will be there to help us present this paper And I will be online to introduce and discuss it🥳 Talk video: drive.google.com/file/d/10uYskZ…
Today, we're unveiling the RL core that makes Redwood possible—a real-world reinforcement learning system that seamlessly blends fundamental behaviors like getting up, sitting, squatting, kneeling, walking, and running into one fluid, controllable…
🤖Can we build a generalized robot navigation policy without any real-robot data? 👏We introduce NavDP, which can zero-shot adapt to different robots in the open world. Website: wzcai99.github.io/navigation-dif… Github: github.com/wzcai99/NavDP/ Arxiv: arxiv.org/abs/2505.08712
Squatting / Climbing stairs / Whole-body control This is done just w/ github.com/OpenRobotLab/O… but w/ a new perception approach Maybe we can dive deeper into what complex environments humanoids can deal with (I mean 100% SR) and distinguish them from non-legged robots Stay tuned :)
We released RoboDuet, a novel framework which employs two collaborative policies to realize locomotion and manipulation simultaneously. Policies trained by RoboDuet enable zero-shot deployment across different quadruped robots in the real world.
Spent half an hour modifying the training code of HOMIE (homietele.github.io) to train another kind of full-size humanoid, the Unitree H1-2, which is much heavier than the Unitree G1. With only minor modifications to the reward, and after less than 2 hours of training, I just achieved a…
Totally agreed! You should achieve the best teleoperation first, so that you have abundant high-quality data to train robots. Teleoperation is thus a key step on the way to our final goals, alongside the algorithms for training autonomous models. You cannot say which one is…
Nice results from @1x_tech. TBH I don't really care whether the loco-mani demos in this video are via teleop or not, because either way it is super impressive. Many people misunderstand "teleoperation" for humanoids and other mobile manipulators. In a nutshell,…
Amazing! This result is so impressive! Could this be a milestone for humanoid robots?🦾
My team has been working for a year trying to solve intelligence for the home Meet Helix 🧬: the first Humanoid Vision-Language-Action model This is one Helix neural network running on 2 Figure robots at once They've never seen any of these items before
Really impressive! Loco-manipulation data collection is not far away now.