Lerrel Pinto
@LerrelPinto
Trying to make robots more dexterous, robust and general @nyuniversity
We have developed a new tactile sensor, called e-Flesh, with a simple working principle: measure deformations in 3D printable microstructures. Now all you need to make tactile sensors is a 3D printer, magnets, and magnetometers! 🧵
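A rough sketch of the readout idea, assuming a single magnet embedded in the printed microstructure above a 3-axis magnetometer, with a linear calibration from field change to deflection (the baseline values and gain below are illustrative, not from the e-Flesh release):

```python
import numpy as np

# Illustrative only: pressing the 3D-printed microstructure moves an embedded magnet,
# which changes the field seen by a magnetometer underneath; a calibration map turns
# that change into an estimated deformation (and hence contact force).

BASELINE = np.array([120.0, -35.0, 410.0])   # rest-state field (uT), measured at startup
GAIN_MM_PER_UT = 0.004                        # hypothetical calibration: mm of deflection per uT

def estimate_deformation(field_ut: np.ndarray) -> float:
    """Return estimated normal deformation in mm from one magnetometer sample."""
    delta = field_ut - BASELINE
    # For small deflections the field change is roughly linear in displacement.
    return GAIN_MM_PER_UT * float(np.linalg.norm(delta))

# Example with a simulated reading (a real build would read this from the magnetometer driver).
sample = np.array([118.5, -34.0, 455.0])
print(f"estimated deformation: {estimate_deformation(sample):.3f} mm")
```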
I got to play with the e-Flesh and robotic hand at Open Sauce! These low-cost, open-source designs made me so excited for all the new humanoid research and hobbyist projects they can be used in!!
Is "scaling is all you need" the right path for robotics? Announcing our @corl_conf workshop on "Resource-Rational Robot Learning", where we will explore how to build efficient intelligent systems that learn & thrive under real-world constraints. Submission deadline: Aug 8 🧵
Nice to see RUMs being live-demoed at the @UN AI for Good Summit in Geneva. You can pretty much drop any object in front of this robot and it can semi-reliably pick it up. This general-purpose model and associated robot tools will be open-sourced soon!
We had an incredible time showcasing everything Stretch 3 is capable of at the AI for Good Summit! It was a pleasure to be joined by the talented team from the NYU GRAIL lab, who demonstrated their cutting-edge work on Robot Utility Models. learn more at: robotutilitymodels.com
Robot collecting robot data. Having fun with this awesome data collection tool from @notmahi! *Auto reset at the end.
Robots no longer have to choose between being precise and being adaptable. This new method gives them both! [📍 Bookmark Paper & Code] ViTaL teaches robots precise, contact-rich tasks that work in any scene, from your cluttered kitchen to a busy factory floor…
A nice pipeline: use a VLM to find objects in scene, get close, and use a well-constrained visuo-tactile policy to handle the last inch.
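One way to structure that coarse-to-fine pipeline, as a hedged sketch: the functions `locate_object`, `servo_to`, and `visuo_tactile_policy` below are placeholders standing in for a VLM query, a coarse reaching controller, and the learned local skill, not actual ViTaL code.

```python
# Sketch of the pipeline described above. All three components are stubs; a real
# system would back them with a VLM, a motion planner, and a trained local policy.

def locate_object(image, prompt: str):
    """VLM stand-in: return an approximate 3D position of the named object."""
    return (0.42, -0.10, 0.05)  # dummy pose in the robot frame

def servo_to(target_xyz, tolerance_m: float = 0.02):
    """Coarse controller stand-in: drive the end-effector near the target."""
    print(f"moving to within {tolerance_m * 100:.0f} cm of {target_xyz}")

def visuo_tactile_policy(wrist_image, tactile_reading):
    """Local skill stand-in: output a small corrective end-effector action."""
    return [0.0, 0.0, -0.001]  # e.g., press down 1 mm

def run_task(scene_image, prompt: str):
    # 1) VLM finds the object anywhere in the scene (generalization).
    target = locate_object(scene_image, prompt)
    # 2) Move close with a coarse, scene-level controller.
    servo_to(target)
    # 3) Hand off the "last inch" to the precise, contact-rich local policy.
    for _ in range(100):
        action = visuo_tactile_policy(wrist_image=None, tactile_reading=None)
        # send `action` to the robot here

run_task(scene_image=None, prompt="plug the connector into the socket")
```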
Current robot policies often face a tradeoff: they're either precise (but brittle) or generalizable (but imprecise). We present ViTaL, a framework that lets robots generalize precise, contact-rich manipulation skills across unseen environments with millimeter-level precision. 🧵
🚀 With minimal data and a straightforward training setup, our Visuo-Tactile Local policy (ViTaL) fuses egocentric vision + tactile feedback to achieve millimeter-level precision & zero-shot generalization! 🤖✨ Details ▶️ vitalprecise.github.io
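For intuition on the vision + touch fusion, a minimal PyTorch-style sketch (the layer sizes and simple concatenation fusion are assumptions for illustration, not the released ViTaL architecture):

```python
import torch
import torch.nn as nn

class VisuoTactilePolicy(nn.Module):
    """Toy policy that concatenates image and tactile features to predict an action."""
    def __init__(self, tactile_dim: int = 15, action_dim: int = 7):
        super().__init__()
        # Tiny CNN encoder for the egocentric / wrist camera image.
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP encoder for a flat tactile reading (e.g., per-taxel magnetometer values).
        self.tactile = nn.Sequential(nn.Linear(tactile_dim, 32), nn.ReLU())
        # Fused head outputs a delta end-effector action.
        self.head = nn.Sequential(nn.Linear(32 + 32, 64), nn.ReLU(), nn.Linear(64, action_dim))

    def forward(self, image: torch.Tensor, tactile: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.vision(image), self.tactile(tactile)], dim=-1)
        return self.head(z)

policy = VisuoTactilePolicy()
action = policy(torch.randn(1, 3, 96, 96), torch.randn(1, 15))
print(action.shape)  # torch.Size([1, 7])
```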
Thanks @Stone_Tao ! Glad to see this sensor being made by the community!
This is really cool, and popular / reproducible enough that I'm finding people making the sensor in the depths of Chinese WeChat groups. The attention to reproducibility from Lerrel's lab is incredible and something I often strive to achieve.
Egocentric data collection can allow us to scale robot data so much more quickly than other methods of data collection.
Full episode dropping soon! Geeking out with @vincentjliu @AdemiAdeniji on EgoZero: Robot Learning from Smart Glasses egozero-robot.github.io Co-hosted by @chris_j_paxton @micoolcho
Glad to see this line of work that brings robots closer to feeding humans who need assistance being recognized at RSS 2025. Congrats @TapoBhat and team!
And we won the #RSS 2025 Best Paper Award! Congrats @rkjenamani and the entire @EmpriseLab team @CornellCIS 🎉🎉
This looks cool! Congratulations @Vikashplus and team.
All forms of intelligence co-emerged with a body, except AI. We're building a #future where AI evolves as your lifelike digital twin to assist your needs across health, sports, daily life, creativity, & beyond... myolab.ai ➡️ Preview your first #HumanEmbodiedAI
I’ll be speaking at the #RSS2025 workshop on Continual Robot Learning from Humans from 9:30-10 today! I’ll be talking about why a billion dollars may not be needed for general learned robots, and how we’re approaching this within academia on a limited budget and resources 🦾
The #RSS2025 Workshop on Continual Robot Learning from Humans is happening on June 21. We have an amazing lineup of speakers discussing how we can enable robots to acquire new skills and knowledge from humans continuously. Join us in person and on Zoom (info on our website)!
I spent some time playing with this at ICRA and was actually really impressed. What a cool piece of technology. We need affordable, widely-available tactile sensors to enable research