Raunaq Bhirangi
@Raunaqmb
Teaching robots to feel touch at NYU. I make tactile sensors and now you can make them too: http://e-flesh.com! Other flavors: http://reskin.dev, http://any-skin.github.io
Tactile sensing is gaining traction, but slowly. Why? Because integration remains difficult. But what if adding touch sensors to your robot was as easy as hitting “print”? Introducing eFlesh: a 3D-printable, customizable tactile sensor. Shape it. Size it. Print it. 🧶👇
It is difficult to get robots to be both precise and general. We just released a new technique for precise manipulation that achieves millimeter-level precision while being robust to large visual variations. The key is a careful combination of visuo-tactile learning and RL. 🧵👇
Thanks @Stone_Tao! Glad to see this sensor being made by the community!
This is really cool, and popular/reproducible enough that I'm finding people making the sensor in the depths of Chinese WeChat groups. The attention to reproducibility from Lerrel's lab is incredible and something I often strive to achieve.
We have developed a new tactile sensor, called eFlesh, with a simple working principle: measure deformations in 3D-printable microstructures. Now all you need to make tactile sensors is a 3D printer, magnets, and magnetometers! 🧵
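For anyone curious what reading such a sensor might look like, here is a minimal sketch of the idea as I read it from the thread (my own illustration, not the released eFlesh code): a magnet embedded in the printed microstructure moves when the structure deforms, and a magnetometer underneath picks up the resulting change in magnetic field, so contact shows up as a deviation from the resting baseline. The read_magnetometer stub is a placeholder for whatever driver your board actually uses.

import numpy as np

def read_magnetometer() -> np.ndarray:
    """Placeholder: return one (Bx, By, Bz) sample in microtesla.
    Swap in your actual magnetometer driver (I2C/SPI) here."""
    raise NotImplementedError

def calibrate(n_samples: int = 200) -> np.ndarray:
    """Average readings while the sensor is untouched to get a baseline field."""
    return np.mean([read_magnetometer() for _ in range(n_samples)], axis=0)

def contact_signal(baseline: np.ndarray) -> float:
    """Deviation of the current field from baseline; grows as the
    microstructure deforms and the embedded magnets displace."""
    return float(np.linalg.norm(read_magnetometer() - baseline))

In practice you would calibrate once at startup and then threshold or regress on contact_signal; the released code presumably does something more careful, but this is the core of the magnet-plus-magnetometer principle.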
🚀 With minimal data and a straightforward training setup, our Visuo-Tactile Local Policy (ViTaL) fuses egocentric vision + tactile feedback to achieve millimeter-level precision & zero-shot generalization! 🤖✨ Details ▶️ vitalprecise.github.io
Current robot policies often face a tradeoff: they're either precise (but brittle) or generalizable (but imprecise). We present ViTaL, a framework that lets robots generalize precise, contact-rich manipulation skills across unseen environments with millimeter-level precision. 🧵
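As an illustration of the visuo-tactile fusion idea, here is a hypothetical sketch in PyTorch. The encoder choices, feature sizes, and the 15-dimensional tactile input (e.g. five 3-axis magnetometers) are my assumptions for illustration, not ViTaL's actual architecture: encode the egocentric image and the tactile reading separately, concatenate the features, and regress an action.

import torch
import torch.nn as nn

class VisuoTactilePolicy(nn.Module):
    def __init__(self, tactile_dim: int = 15, action_dim: int = 7):
        super().__init__()
        # Small CNN for egocentric RGB; a real system might use a
        # pretrained visual encoder instead.
        self.vision = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # MLP encoder for the raw tactile vector.
        self.tactile = nn.Sequential(nn.Linear(tactile_dim, 32), nn.ReLU())
        # Fused features -> action (e.g. 6-DoF end-effector delta + gripper).
        self.head = nn.Sequential(
            nn.Linear(32 + 32, 64), nn.ReLU(), nn.Linear(64, action_dim),
        )

    def forward(self, image: torch.Tensor, touch: torch.Tensor) -> torch.Tensor:
        z = torch.cat([self.vision(image), self.tactile(touch)], dim=-1)
        return self.head(z)

The point of the sketch is the split: vision supplies coarse, generalizable context while touch supplies the precise local contact signal, and the fused head can be trained with imitation plus RL fine-tuning as the thread describes.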
This might be the fastest we've gone from releasing a sensor to someone making their OWN copy of it! Hope eFlesh continues lowering the barrier for roboticists to use tactile sensors.
What would a World Model look like if we start from a real embodied agent acting in the real world? It has to have: 1) A real, physically grounded and complex action space—not just abstract control signals. 2) Diverse, real-life scenarios and activities. Or in short: It has to…
We shipped a robot running on-device and brought it to RSS! Please come check it out 🤖🦾
You don’t need a lab to give your robot touch. 🖨️ Just a printer. [Paper & code ⬇️] eFlesh is a 3D-printable, customizable tactile sensor built for makers and researchers. No fancy lab setup, no expensive gear… just a $5 bill of materials, a hobbyist printer, and a simple…
Making touch sensors has never been easier! Excited to present eFlesh, a 3D printable tactile sensor that aims to democratize robotic touch. All you need to make your own eFlesh is a 3D printer, some magnets and a magnetometer. See thread 👇and visit e-flesh.com