Venkatesh
@venkyp2000
Robot therapist @CILVRatNYU @nyuniversity ; Alum @IITIOfficial
Making touch sensors has never been easier! Excited to present eFlesh, a 3D printable tactile sensor that aims to democratize robotic touch. All you need to make your own eFlesh is a 3D printer, some magnets and a magnetometer. See thread 👇 and visit e-flesh.com
Generalization needs data. But data collection is hard for precise tasks like plugging USBs, swiping cards, inserting plugs, and keying locks. Introducing robust, precise VisuoTactile Local (ViTaL) policies: >90% success rates from just 30 demos and 45 min of real-world RL.🧶⬇️
Thanks @Stone_Tao ! Glad to see this sensor being made by the community!
This is really cool and popular / reproducible enough that I'm finding people making the sensor in the depths of Chinese WeChat groups. The attention to reproducibility from Lerrel's lab is incredible and something I often strive to achieve.
We have developed a new tactile sensor, called eFlesh, with a simple working principle: measure deformations in 3D printable microstructures. Now all you need to make tactile sensors is a 3D printer, magnets, and magnetometers! 🧵
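For anyone curious how the magnet + magnetometer readout works in principle, here is a minimal Python sketch: a magnet embedded in the printed microstructure shifts as the skin deforms, and a magnetometer underneath picks up the change in magnetic field. The driver call, baseline routine, and linear calibration below are illustrative assumptions, not the released eFlesh code.

# Minimal sketch of the eFlesh sensing principle (illustrative only).
import numpy as np

def read_magnetometer() -> np.ndarray:
    # Hypothetical stand-in for an I2C read from a real magnetometer breakout;
    # returns a 3-axis field reading in microtesla with a little noise.
    return np.array([12.3, -4.1, 48.9]) + np.random.normal(0.0, 0.05, size=3)

def calibrate_baseline(num_samples: int = 200) -> np.ndarray:
    # Average the field with no contact to get a per-sensor baseline.
    return np.mean([read_magnetometer() for _ in range(num_samples)], axis=0)

def field_delta_to_force(delta_uT: np.ndarray, gain: np.ndarray) -> np.ndarray:
    # Map the field change to an estimated 3-axis contact force.
    # A linear map is the simplest assumption; in practice the mapping is
    # fit or learned from controlled indentation data.
    return gain @ delta_uT

if __name__ == "__main__":
    baseline = calibrate_baseline()
    gain = np.eye(3) * 0.02            # illustrative newtons-per-microtesla calibration
    delta = read_magnetometer() - baseline
    print("estimated contact force (N):", np.round(field_delta_to_force(delta, gain), 3))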
Teaching robots to learn only from RGB human videos is hard! In Feel The Force (FTF), we teach robots to mimic the tactile feedback humans experience when handling objects. This allows for delicate, touch-sensitive tasks—like picking up a raw egg without breaking it. 🧵👇
Imagine robots learning new skills—without any robot data. Today, we're excited to release EgoZero: our first steps in training robot policies that operate in unseen environments, solely from data collected through humans wearing Aria smart glasses. 🧵👇
Morning, #ICRA2025 @ieee_ras_icra! Bring something small 🍋🍑 and have our Robot Utility Model pick it up at our EXPO demo today from 1-5 PM, between halls A2/A3! Talk and poster are right before, 11:15-12:15, in room 411. Also, DM me if you want to chat about 🤖 for the messy, real world!
When life gives you lemons, you pick them up.
Super excited to present our open-source robot hand RUKA! I had a lot of fun working on this with @irmakkguzey and all our amazing collaborators: @BillyYYan, @AadhithyaIyer, Lisa Kondrich, @NXBhattasali, and @LerrelPinto. Check out our website at ruka-hand.github.io
Despite great advances in learning dexterity, hardware remains a major bottleneck. Most dexterous hands are either bulky, weak or expensive. I’m thrilled to present the RUKA Hand — a powerful, accessible research tool for dexterous manipulation that overcomes these limitations!
🚀 With minimal data and a straightforward training setup, our VisuoTactile Local (ViTaL) policy fuses egocentric vision + tactile feedback to achieve millimeter-level precision & zero-shot generalization! 🤖✨ Details ▶️ vitalprecise.github.io
Current robot policies often face a tradeoff: they're either precise (but brittle) or generalizable (but imprecise). We present ViTaL, a framework that lets robots generalize contact-rich manipulation skills to unseen environments with millimeter-level precision. 🧵
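As a rough illustration of the vision + touch fusion behind ViTaL, here is a small PyTorch sketch that concatenates an egocentric image embedding with a tactile embedding and predicts a relative end-effector action. Module names, sizes, and the action head are my assumptions, not the released architecture.

import torch
import torch.nn as nn

class VisuoTactilePolicy(nn.Module):
    def __init__(self, tactile_dim: int = 15, action_dim: int = 6):
        super().__init__()
        # Small CNN for the egocentric wrist-camera image.
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP for the flattened tactile reading.
        self.tactile = nn.Sequential(nn.Linear(tactile_dim, 64), nn.ReLU())
        # Fused features -> relative end-effector action.
        self.head = nn.Sequential(
            nn.Linear(32 + 64, 128), nn.ReLU(),
            nn.Linear(128, action_dim),
        )

    def forward(self, image: torch.Tensor, touch: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.vision(image), self.tactile(touch)], dim=-1)
        return self.head(fused)

if __name__ == "__main__":
    policy = VisuoTactilePolicy()
    img = torch.randn(1, 3, 128, 128)   # egocentric RGB frame
    touch = torch.randn(1, 15)          # tactile signal from the sensor
    print(policy(img, touch).shape)     # -> torch.Size([1, 6])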
Happy to see folks already reproducing eFlesh!
Interested in how generative AI can be used for human-robot interaction? We’re organizing the 2nd Workshop on Generative AI for Human-Robot Interaction (GenAI-HRI) at #RSS2025 in LA — bringing together the world's leading experts in the field. The workshop is happening on Wed,…
Tactile sensing is gaining traction, but slowly. Why? Because integration remains difficult. But what if adding touch sensors to your robot was as easy as hitting “print”? Introducing eFlesh: a 3D-printable, customizable tactile sensor. Shape it. Size it. Print it. 🧶👇
Everyday human data is robotics’ answer to internet-scale tokens. But how can robots learn to feel—just from videos?📹 Introducing FeelTheForce (FTF): force-sensitive manipulation policies learned from natural human interactions🖐️🤖 👉 feel-the-force-ftf.github.io 1/n
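To make the force-tracking idea concrete, here is a toy Python sketch: a policy trained on human videos predicts a target grasp force, and a simple proportional controller servos the gripper closure to hold it. The simulated force model, gains, and function names are illustrative, not the FTF implementation.

import numpy as np

def simulated_grip_force(closure: float, stiffness: float = 20.0) -> float:
    # Stand-in for a tactile force reading: force grows once contact is made.
    return max(0.0, stiffness * (closure - 0.3))

def track_force(target_force: float, steps: int = 100, kp: float = 0.02) -> float:
    # Proportional control of gripper closure to reach the predicted grasp force.
    closure = 0.0
    for _ in range(steps):
        error = target_force - simulated_grip_force(closure)
        closure = float(np.clip(closure + kp * error, 0.0, 1.0))
    return simulated_grip_force(closure)

if __name__ == "__main__":
    # Suppose the video-trained policy predicts a gentle 1.5 N grasp for a raw egg;
    # the controller then servos the gripper to hold that force.
    print("achieved force (N):", round(track_force(target_force=1.5), 2))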
Robots need touch in human-like hands to reach the goal of general manipulation. However, today's approaches either skip tactile sensing or use a separate architecture for each tactile task. Can 1 model improve many tactile tasks? 🌟Introducing Sparsh-skin: tinyurl.com/y935wz5c 1/6
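A minimal sketch of the "one tactile model, many tasks" idea: a shared encoder turns raw tactile signals into an embedding that small task-specific heads reuse. Layer sizes, task names, and the frozen-feature setup are assumptions, not the Sparsh-skin release.

import torch
import torch.nn as nn

class SharedTactileEncoder(nn.Module):
    # One encoder pretrained on raw tactile signals, reused across tasks.
    def __init__(self, in_dim: int = 96, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

if __name__ == "__main__":
    encoder = SharedTactileEncoder()
    # Lightweight task-specific heads all consume the same embedding.
    heads = {
        "slip_detection": nn.Linear(128, 2),    # slip / no-slip logits
        "force_regression": nn.Linear(128, 3),  # 3-axis contact force
        "pose_estimation": nn.Linear(128, 6),   # in-hand object pose
    }
    signal = torch.randn(4, 96)                 # batch of tactile readings
    z = encoder(signal).detach()                # shared (frozen) features
    for task, head in heads.items():
        print(task, head(z).shape)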
The most frustrating part of imitation learning is collecting huge amounts of teleop data. But why teleop robots when robots can learn by watching us? Introducing Point Policy, a novel framework that enables robots to learn from human videos without any teleop, sim2real, or RL.
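As a toy illustration of point-based learning from human videos, the sketch below maps a tracked hand-keypoint trajectory from the camera frame into robot end-effector waypoints. The tracker output, extrinsics, and data are all made up for the example; this is not the Point Policy code.

import numpy as np

def hand_points_to_waypoints(hand_traj_cam: np.ndarray, T_cam_to_robot: np.ndarray) -> np.ndarray:
    # Map per-frame hand keypoints (camera frame) to robot-frame waypoints.
    waypoints = []
    for pts in hand_traj_cam:                   # pts: (K, 3) keypoints in one frame
        centroid = pts.mean(axis=0)             # summarize the hand by its centroid
        homog = np.append(centroid, 1.0)        # homogeneous coordinates
        waypoints.append((T_cam_to_robot @ homog)[:3])
    return np.stack(waypoints)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake 10-frame trajectory of 5 tracked hand keypoints in the camera frame.
    traj = rng.normal(size=(10, 5, 3)) * 0.01 + np.linspace(0.0, 0.2, 10)[:, None, None]
    T = np.eye(4)                               # identity extrinsics, just for the demo
    print(hand_points_to_waypoints(traj, T).shape)   # -> (10, 3)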