Mahi Shafiullah
@notmahi
Trying to teach robots to do all my chores at @CILVRatNYU @nyuniversity. Previously @MIT and visiting researcher at @MetaAI.
Robot Utility Models (RUMs) enable basic tasks, like door opening, drawer opening, and object reorientation, at ~90% accuracy without ANY finetuning (i.e. zero-shot) in unseen new environments. Fully open source!!! Models, data, code & hardware. We think this is super exciting, why? …
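For anyone wondering what "zero-shot" deployment means in practice, here is a minimal, hypothetical sketch: the same pretrained policy is run in a new environment with no gradient updates at all. The checkpoint, get_camera_frame(), and send_to_robot() below are placeholders, not the released RUM interface.

```python
# Hypothetical zero-shot deployment sketch. The policy stand-in, get_camera_frame(),
# and send_to_robot() are placeholders, not the released RUM interface.
import numpy as np
import torch
import torch.nn as nn

# Stand-in for a pretrained policy; in practice you would load the released
# checkpoint (e.g. with torch.jit.load) and apply no finetuning whatsoever.
policy = nn.Sequential(nn.Flatten(), nn.Linear(3 * 224 * 224, 7))
policy.eval()

def get_camera_frame() -> np.ndarray:
    """Placeholder: grab an RGB frame from the robot's camera in the new environment."""
    return np.zeros((224, 224, 3), dtype=np.uint8)

def send_to_robot(action: np.ndarray) -> None:
    """Placeholder: forward the predicted end-effector action to the robot."""
    print("action:", action)

with torch.no_grad():  # inference only: no gradient updates, no finetuning
    frame = torch.from_numpy(get_camera_frame()).permute(2, 0, 1).float() / 255.0
    action = policy(frame.unsqueeze(0)).squeeze(0).numpy()
    send_to_robot(action)
```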
It is difficult to get robots to be both precise and general. We just released a new technique for manipulation that achieves millimeter-level precision while remaining robust to large visual variations. The key is a careful combination of visuo-tactile learning and RL.
Generalization needs data. But data collection is hard for precise tasks like plugging USBs, swiping cards, inserting plugs, and keying locks. Introducing robust, precise VisuoTactile Local (ViTaL) policies: >90% success rates from just 30 demos and 45 min of real-world RL.
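A rough sketch of the recipe described here: behavior cloning from a small set of demonstrations, followed by a short phase of real-world RL on fused visual and tactile features. The network sizes, dummy tensors, and two-stage structure below are illustrative assumptions, not the released ViTaL code.

```python
# Illustrative only: a BC-then-RL recipe on fused visual + tactile features.
# Shapes, networks, and data are placeholders, not the released ViTaL code.
import torch
import torch.nn as nn

class VisuoTactilePolicy(nn.Module):
    def __init__(self, img_dim=512, tactile_dim=64, action_dim=7):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(img_dim + tactile_dim, 256), nn.ReLU(),
            nn.Linear(256, action_dim),
        )

    def forward(self, img_feat, tactile_feat):
        # Fuse visual and tactile features before predicting an action.
        return self.head(torch.cat([img_feat, tactile_feat], dim=-1))

policy = VisuoTactilePolicy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)

# Stage 1: behavior cloning on ~30 demonstrations (dummy tensors stand in for real data).
demos = [(torch.randn(512), torch.randn(64), torch.randn(7)) for _ in range(30)]
for img, tac, act in demos:
    loss = nn.functional.mse_loss(policy(img, tac), act)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: a short phase of real-world RL would then fine-tune the BC policy,
# e.g. by maximizing a task-success reward around the BC actions (omitted here).
```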
Full episode dropping soon! Geeking out with @vincentjliu @AdemiAdeniji on EgoZero: Robot Learning from Smart Glasses (egozero-robot.github.io). Co-hosted by @chris_j_paxton @micoolcho
eFlesh is one of the extremely futuristic projects my labmates have been working on that makes me feel hopeful about the future of accessible robot learning: yes, you can just 3D print arbitrarily shaped tactile sensors, and they're actually surprisingly robust and immensely useful!
We have developed a new tactile sensor, called e-Flesh, with a simple working principle: measure deformations in 3D printable microstructures. Now all you need to make tactile sensors is a 3D printer, magnets, and magnetometers!
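To make the working principle concrete, here is a hedged, hypothetical read-out sketch: a magnet embedded in the printed microstructure moves when the structure deforms, and a nearby magnetometer sees that motion as a change in magnetic field. The read_magnetometer() function and the calibration constant are stand-ins, not the actual eFlesh pipeline.

```python
# Hypothetical read-out sketch: deformation of the printed microstructure moves an
# embedded magnet, which a magnetometer sees as a change in magnetic field.
# read_magnetometer() and the calibration constant are placeholders, not eFlesh code.
import numpy as np

def read_magnetometer() -> np.ndarray:
    """Placeholder: return a 3-axis magnetic field reading (in microtesla)."""
    return np.array([12.0, -3.5, 48.2])

# Field measured with the sensor at rest (no contact).
baseline = read_magnetometer()

# Rough linear calibration: microns of deformation per microtesla of field change.
# A real sensor would be calibrated against known indentations.
MICRONS_PER_UT = 4.0

def estimate_deformation() -> float:
    """Estimate how much the microstructure is compressed, from the field change."""
    delta = read_magnetometer() - baseline
    return float(np.linalg.norm(delta)) * MICRONS_PER_UT

print(f"estimated deformation: {estimate_deformation():.1f} um")
```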
It was nice engaging with the CV community on ways to stand out in the crowd. My answer was simple: work on robotics. There are so many unanswered problems and open pastures for research if you are a new researcher. Below are the 6 problems I focused on in my talk.
In this #CVPR2025 edition of our community-building workshop series, we focus on supporting the growth of early-career researchers. Join us tomorrow (Jun 11) at 12:45 PM in Room 209. Schedule: sites.google.com/view/standoutc… We have an exciting lineup of invited talks and candid…
RUMs is at #CVPR2025 today. Check it out in Hall B and bring objects of your own to test the generality of our policies!
Live demo-ing RUMs at @CVPR this afternoon next to the expo sessions: stop by with something small and let's see if the robot can pick it up zero-shot! #CVPR2025
We had an incredible time showcasing everything Stretch 3 is capable of at the AI for Good Summit! It was a pleasure to be joined by the talented team from the NYU GRAIL lab, who demonstrated their cutting-edge work on Robot Utility Models. Learn more at: robotutilitymodels.com
Current robot policies often face a tradeoff: they're either precise (but brittle) or generalizable (but imprecise). We present ViTaL, a framework that lets robots generalize precise, contact-rich manipulation skills across unseen environments with millimeter-level precision.
Second talk is happening. Mahi Shafiullah is talking about generalization in robotics.
We're starting our panel session now! I know the outdoor environment at the poster session is very fun, but the discussion you will hear here may be even more interesting!
Workshop on Mobile Manipulation in #RSS2025 kicking off with a talk from @leto__jean! Come by EEB 132 if you're here in person, or join us on Zoom (link on the website).
Come join our workshop! Map for in-person participants and Zoom link for online participation: stanford.zoom.us/j/92067016723?…
Join our #RSS2025 MoMa Workshop tomorrow to hear about the latest advancements and challenges in mobile manipulation. Learn more: rss-moma-2025.github.io Also available on Zoom!
The #RSS2025 Workshop on Continual Robot Learning from Humans is happening on June 21. We have an amazing lineup of speakers discussing how we can enable robots to acquire new skills and knowledge from humans continuously. Join us in person and on Zoom (info on our website)!
We're also organizing a workshop on Mobile Manipulation at #RSS2025 with a fun set of speakers! Mobile manipulation is one of the more open research questions in robot learning today; come hear what our speakers make of it, see the posters, or even catch a real robot demo!

I'll be speaking at the #RSS2025 workshop on Continual Robot Learning from Humans from 9:30-10 today! I'll be talking about why a billion dollars may not be needed for general learned robots, and how we're approaching this within academia with limited budget and resources.
Making touch sensors has never been easier! Excited to present eFlesh, a 3D printable tactile sensor that aims to democratize robotic touch. All you need to make your own eFlesh is a 3D printer, some magnets, and a magnetometer. See the thread and visit e-flesh.com
Tactile sensing is gaining traction, but slowly. Why? Because integration remains difficult. But what if adding touch sensors to your robot were as easy as hitting "print"? Introducing eFlesh: a 3D-printable, customizable tactile sensor. Shape it. Size it. Print it.