Xiaolong Wang
@xiaolonw
Associate Professor @UCSDJacobs Postdoc @berkeley_ai PhD @CMU_Robotics
After one year of effort from @RchalYang, EgoVLA is finally released. I think of this as training a cross-embodiment VLA, with humans as just another kind of robot. EgoVLA essentially provides a smart way to align the action spaces of different embodiments for dexterous…
How can we leverage diverse human videos to improve robot manipulation? Excited to introduce EgoVLA — a Vision-Language-Action model trained on egocentric human videos by explicitly modeling wrist & hand motion. We build a shared action space between humans and robots, enabling…
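For intuition, here is a minimal sketch of what a shared human/robot action space can look like: both embodiments are expressed as a wrist pose plus hand joint targets, with human fingertip motion retargeted into robot joint space. The function names, shapes, and the linear retargeting map are illustrative assumptions, not the EgoVLA implementation.

```python
import numpy as np

def to_shared_action(wrist_pose_cam, fingertip_pos_wrist, retarget_matrix):
    """Map a human hand observation into a shared wrist + hand action.

    wrist_pose_cam:      4x4 wrist pose in the egocentric camera frame.
    fingertip_pos_wrist: (5, 3) fingertip positions in the wrist frame.
    retarget_matrix:     (n_joints, 15) linear map from flattened fingertip
                         positions to robot hand joint angles (a common
                         retargeting choice).
    All names here are hypothetical; see the paper for the real action space.
    """
    # The 6-DoF wrist target is shared directly across embodiments.
    wrist_action = wrist_pose_cam
    # Finger motion is retargeted into the robot hand's joint space.
    hand_action = retarget_matrix @ fingertip_pos_wrist.reshape(-1)
    return wrist_action, hand_action

# Example with placeholder inputs for a 16-joint robot hand.
wrist, hand = to_shared_action(np.eye(4), np.zeros((5, 3)), np.zeros((16, 15)))
```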
This is really cool, and already popular / reproducible enough that I'm finding people making the sensor in the depths of Chinese WeChat groups. The attention to reproducibility from Lerrel's lab is incredible and something I often strive to achieve.
We have developed a new tactile sensor, called e-Flesh, with a simple working principle: measure deformations in 3D printable microstructures. Now all you need to make tactile sensors is a 3D printer, magnets, and magnetometers! 🧵
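A rough sketch of the sensing principle stated above: a magnet embedded in the printed microstructure moves as the structure deforms, and a magnetometer reads the resulting field change, which a calibration maps to deformation. The rest field and gain below are made-up placeholders, not e-Flesh calibration values.

```python
import numpy as np

# Illustrative constants only; a real sensor would be calibrated per unit.
REST_FIELD = np.array([12.0, -3.0, 48.0])   # magnetometer reading at rest (uT), assumed
CALIB_GAIN = 0.05                            # deformation per unit field change (mm/uT), assumed

def estimate_deformation(field_uT: np.ndarray) -> float:
    """Estimate microstructure deformation from a single magnetometer reading."""
    delta = np.linalg.norm(field_uT - REST_FIELD)  # field change caused by magnet displacement
    return CALIB_GAIN * delta                      # simple linear calibration

print(estimate_deformation(np.array([12.5, -2.0, 52.0])))
```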
We spent nearly a year from starting this work to a solid release. It squarely targets building an effective and efficient data pipeline for dexterous hands, which is one of the core problems in DexHand manipulation. Congrats to the team, and especially…
How to generate billion-scale manipulation demonstrations easily? Let us leverage generative models! 🤖✨ We introduce Dex1B, a framework that generates 1 BILLION diverse dexterous hand demonstrations for both grasping 🖐️and articulation 💻 tasks using a simple C-VAE model.
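For readers unfamiliar with C-VAEs, here is a minimal conditional-VAE sketch in the spirit described above: sample hand poses conditioned on an object encoding. Dimensions and layers are illustrative assumptions rather than the released Dex1B model.

```python
import torch
import torch.nn as nn

class GraspCVAE(nn.Module):
    """Minimal conditional VAE: generate hand poses conditioned on object features."""

    def __init__(self, pose_dim=28, cond_dim=128, latent_dim=32):
        super().__init__()
        self.latent_dim = latent_dim
        self.encoder = nn.Sequential(
            nn.Linear(pose_dim + cond_dim, 256), nn.ReLU(),
            nn.Linear(256, 2 * latent_dim),              # mean and log-variance
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + cond_dim, 256), nn.ReLU(),
            nn.Linear(256, pose_dim),                    # reconstructed hand pose
        )

    def forward(self, pose, cond):
        mu, logvar = self.encoder(torch.cat([pose, cond], -1)).chunk(2, -1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization trick
        recon = self.decoder(torch.cat([z, cond], -1))
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return recon, kl

    @torch.no_grad()
    def sample(self, cond):
        # Draw diverse grasps for one object by sampling the latent prior.
        z = torch.randn(cond.shape[0], self.latent_dim, device=cond.device)
        return self.decoder(torch.cat([z, cond], -1))
```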
We finally released Dex1B at #RSS2025! Our group has been working on sim2real dexterous manipulation for the last 4 years, and this is a milestone: a grasping policy trained purely in sim on 1B data transfers to real-world deployment without any fine-tuning. The devils…
The generative model faces challenges in feasibility🔧(a lower success rate than deterministic models) and diversity🙌(it tends to interpolate rather than extrapolate). We address these by adding a geometric constraint (an SDF loss) and by using a locally conditioned model🤖.
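A sketch of what the geometric constraint could look like: an SDF-based penalty on hand surface points that fall inside the object, added to the generative training loss. The interface of `object_sdf` and the loss weights are assumptions; the exact Dex1B formulation is in the paper.

```python
import torch

def sdf_penetration_loss(hand_points, object_sdf):
    """Penalize hand surface points with negative signed distance, i.e. points
    inside the object. `object_sdf` is assumed to map (B, N, 3) points to
    (B, N) signed distances."""
    sdf_vals = object_sdf(hand_points)     # signed distance per sampled hand point
    return torch.relu(-sdf_vals).mean()    # only negative (penetrating) values contribute

# Illustrative combined objective with assumed weights:
# loss = recon_loss + 1e-3 * kl + 1.0 * sdf_penetration_loss(hand_points, object_sdf)
```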
📹Recording now available! If you missed our workshop at RSS, you can now watch the full session here: youtu.be/7a5HYjQ4wJo?si… Thanks again to all the speakers and participants!
We are excited to host the 3rd Workshop on Dexterous Manipulation at RSS tomorrow! Join us at OHE 122 starting at 9:00 AM! See you there!
Tactile interaction in the wild can unlock fine-grained manipulation! 🌿🤖✋ We built a portable handheld tactile gripper that enables large-scale visuo-tactile data collection in real-world settings. By pretraining on this data, we bridge vision and touch—allowing robots to:…
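One plausible way to pretrain a shared representation from paired visuo-tactile data is a CLIP-style contrastive objective between the two modalities; this is only an illustrative sketch, and the actual pretraining recipe used in the work may differ.

```python
import torch
import torch.nn.functional as F

def vision_touch_contrastive_loss(img_emb, tac_emb, temperature=0.07):
    """Symmetric InfoNCE between paired visual and tactile embeddings."""
    img_emb = F.normalize(img_emb, dim=-1)
    tac_emb = F.normalize(tac_emb, dim=-1)
    logits = img_emb @ tac_emb.t() / temperature                 # pairwise similarities
    targets = torch.arange(img_emb.shape[0], device=img_emb.device)
    # Match each image to its own tactile reading and vice versa.
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))
```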
One billion Ability Hand demonstrations! 🤯 Huge thanks to Dr. Xiaolong Wang, his team at UCSD, and the robotics community for showcasing the Ability Hand in groundbreaking work at RSS 2025. @xiaolonw @UCSanDiego @PSYONICinc
Missed our RSS workshop? Our recordings are online: youtube.com/@hardware-awar…. All talks were awesome, and we had a very fun panel discussion session 🧐 Huge thanks to our organizers for all the hard work @haqhuy @XiaomengXu11 @s_zhanyi @ma_yuxiang @xiaolonw Mike Tolley
Enjoying the first day of #RSS2025? Consider coming to our workshop 🤖Robot Hardware-Aware Intelligence on Wed! @RoboticsSciSys Thank you to everyone who contributed 🙌 We'll have 16 lightning talks and 11 live demos! More info: rss-hardware-intelligence.github.io
AMO live demo at #RSS2025! 👉 amo-humanoid.github.io
Meet 𝐀𝐌𝐎 — our universal whole‑body controller that unleashes the 𝐟𝐮𝐥𝐥 kinematic workspace of humanoid robots to the physical world. AMO is a single policy trained with RL + Hybrid Mocap & Trajectory‑Opt. Accepted to #RSS2025. Try our open models & more 👉…
Three years of dexterous manipulation workshops since 2023: learn-dex-hand.github.io/rss2023/. Great to see the progress in the field.
Excited to be organizing the dexterous manipulation workshop at #RSS2025 — great energy and lots of interest in dexterous manipulation! dex-manipulation.github.io/rss2025/. Come by in OHE 122!
An upgrade on ACE with a cleaner design, force feedback, and gravity compensation.
🚀 Meet ACE-F — a next-gen teleop system merging human and robot precision. Foldable, portable, cross-platform — it enables 6-DoF haptic control for force-aware manipulation. 🦾 See our demo & talk at the Robot Hardware-Aware Intelligence workshop this Wed @RoboticsSciSys!
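For context on the gravity-compensation feature mentioned above: a teleop arm typically applies joint torques that cancel the gravity wrench on each link so the device feels weightless to the operator. The sketch below is a generic textbook computation with assumed inputs, not the ACE-F controller.

```python
import numpy as np

def gravity_compensation_torques(link_com_jacobians, link_masses,
                                 g=np.array([0.0, 0.0, -9.81])):
    """Joint torques that cancel gravity on every link.

    Assumed inputs: link_com_jacobians[i] is the 3xN translational Jacobian of
    link i's center of mass at the current configuration; link_masses[i] is its
    mass in kg.
    """
    n_joints = link_com_jacobians[0].shape[1]
    tau = np.zeros(n_joints)
    for J_com, m in zip(link_com_jacobians, link_masses):
        tau -= J_com.T @ (m * g)   # oppose the gravity wrench mapped into joint space
    return tau
```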
📣 Heading to #RSS2025? Come see it in action! 🎯 Live demo: Wednesday, June 25, 3:30 PM @ EEB 248 Workshop: Robot Hardware-Aware Intelligence We’ll be showcasing our data-driven, co-designed soft gripper 🥢 — in real-time!
🚀Heading to #RSS2025? Swing by EEB 248 on Wednesday, June 25 at 3:30 PM for a live demo of our data-driven, co-designed soft gripper 🥢 at the workshop Robot Hardware-Aware Intelligence!
We are presenting hardware and policy co-design at the upcoming RSS workshop!