Irmak Guzey
@irmakkguzey
PhD student at @CILVRatNYU. Intern at @AIatMeta. On a mission to make robotic hands as dexterous as human ones! 🤖✋. (she/her)
Despite great advances in learning dexterity, hardware remains a major bottleneck. Most dexterous hands are either bulky, weak or expensive. I’m thrilled to present the RUKA Hand — a powerful, accessible research tool for dexterous manipulation that overcomes these limitations!
Generalization needs data. But data collection is hard for precise tasks like plugging USBs, swiping cards, inserting plugs, and keying locks. Introducing robust, precise VisuoTactile Local (ViTaL) policies: >90% success rates from just 30 demos and 45 min of real-world RL.🧶⬇️
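(Not part of the thread, just a sketch of the general recipe those numbers suggest: clone a policy from a handful of demos, then fine-tune it briefly with real-world RL. The architecture, reward, and update rule below are placeholders, not the ViTaL implementation.)

```python
# Illustrative sketch only -- behavior cloning from ~30 demos, then a short
# RL fine-tuning phase. Random tensors stand in for real robot data.
import torch
import torch.nn as nn

policy = nn.Sequential(            # placeholder visuotactile policy
    nn.Linear(64, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, 7),
)
opt = torch.optim.Adam(policy.parameters(), lr=3e-4)

# Phase 1: behavior cloning on demonstration (observation, action) pairs.
demo_obs, demo_act = torch.randn(3000, 64), torch.randn(3000, 7)
for _ in range(200):
    idx = torch.randint(0, demo_obs.shape[0], (256,))
    loss = nn.functional.mse_loss(policy(demo_obs[idx]), demo_act[idx])
    opt.zero_grad(); loss.backward(); opt.step()

# Phase 2: brief real-world RL fine-tuning. A toy differentiable "reward"
# stands in for the real rollout and policy update.
for episode in range(50):
    obs = torch.randn(1, 64)                      # would come from the robot
    act = policy(obs) + 0.05 * torch.randn(1, 7)  # exploration noise
    reward = -(act ** 2).mean()                   # placeholder reward signal
    (-reward).backward()                          # maximize reward
    opt.step(); opt.zero_grad()
```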
Using grounded keypoints in the environment (a) enables human-to-robot transfer of two-fingered gripper policies trained on **only in-the-wild human data**, and (b) generalizes both spatially and to different objects! Check out this new work by my colleagues to learn more!
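(My reading of the core trick, not the paper's code: once human video and robot observations are reduced to the same environment keypoints, actions expressed relative to those keypoints transfer across embodiments and scene layouts. The detector and coordinates below are made up.)

```python
# Toy illustration: encode a gripper target relative to tracked scene
# keypoints so it can be decoded in a shifted scene (or on a robot).
import numpy as np

def encode(target_xyz, keypoints_xyz):
    """Represent the target as per-keypoint offsets."""
    return target_xyz[None, :] - keypoints_xyz            # shape (K, 3)

def decode(offsets, keypoints_xyz):
    """Recover a world-frame target by averaging keypoint + offset."""
    return (keypoints_xyz + offsets).mean(axis=0)

kp_human = np.array([[0.40, 0.10, 0.02], [0.55, -0.05, 0.02]])  # from human video
kp_robot = kp_human + np.array([0.10, 0.03, 0.00])              # scene shifted

offsets = encode(np.array([0.47, 0.02, 0.10]), kp_human)
print(decode(offsets, kp_robot))   # the target moves with the scene
```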
Imagine robots learning new skills—without any robot data. Today, we're excited to release EgoZero: our first steps in training robot policies that operate in unseen environments, solely from data collected through humans wearing Aria smart glasses. 🧵👇
Morning, #ICRA2025 @ieee_ras_icra! Bring something small 🍋🍑 and have our Robot Utility Model pick it up at our EXPO demo today from 1-5 PM, between halls A2 and A3! Talk and poster are right before, 11:15-12:15 in room 411. Also, DM me if you want to chat about 🤖s for the messy, real world!
When life gives you lemons, you pick them up.
Super excited to present our open-source robot hand RUKA! I had a lot of fun working on this with @irmakkguzey and all our amazing collaborators: @BillyYYan, @AadhithyaIyer, Lisa Kondrich, @NXBhattasali, and @LerrelPinto. Check out our website at ruka-hand.github.io
tendon-driven 3D-printed hand from @irmakkguzey and team at the @LerrelPinto lab.
* costs $1300 to build, compact human profile.
* the tendons are actually off-the-shelf fishing line, super strong and never break. the plastic parts break before the tendons ever do.
* mountable on…
So excited for this!!! The key technical breakthrough here is that we can control joints and fingertips of the robot **without joint encoders**. Learning from self-supervised data collection is all you need for training the humanoid hand control you see below.
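(A minimal sketch of what "control without joint encoders" can look like, under my own assumptions rather than the RUKA code: fit an inverse model from observed fingertip positions back to tendon motor commands, using (command, motion-capture) pairs the hand collects on its own. Motor and fingertip counts are placeholders.)

```python
# Sketch, not the RUKA implementation: learn fingertip -> motor-command
# control from self-collected (motor command, mocap fingertip) pairs.
import torch
import torch.nn as nn

N_MOTORS, N_TIPS = 11, 5                      # assumed sizes, for illustration

# Self-supervised dataset: random motor commands and the fingertip positions
# a mocap system observed for each (random tensors stand in for real logs).
commands = torch.rand(5000, N_MOTORS)
fingertips = torch.rand(5000, N_TIPS * 3)

inverse_model = nn.Sequential(                # desired fingertips -> commands
    nn.Linear(N_TIPS * 3, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, N_MOTORS),
)
opt = torch.optim.Adam(inverse_model.parameters(), lr=1e-3)

for _ in range(300):
    idx = torch.randint(0, commands.shape[0], (256,))
    loss = nn.functional.mse_loss(inverse_model(fingertips[idx]), commands[idx])
    opt.zero_grad(); loss.backward(); opt.step()

# At run time: request a fingertip pose, send the predicted motor command.
motor_command = inverse_model(torch.rand(1, N_TIPS * 3))
```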
🤖 Do VLA models really listen to language instructions? Maybe not 👀 🚀 Introducing our RSS paper: CodeDiffuser -- using VLM-generated code to bridge the gap between **high-level language** and **low-level visuomotor policy** 🎮 Try the live demo: robopil.github.io/code-diffuser/ (1/9)
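(A very rough sketch of the pattern described above, with stubs in place of the real VLM and policy: generated code resolves the language instruction into concrete targets, and a low-level visuomotor policy acts on them. Nothing here is the actual CodeDiffuser API.)

```python
# Stub illustration of "VLM-generated code bridges language and low-level policy".
def vlm_generate_code(instruction: str) -> str:
    # A VLM would write this snippet from the instruction; hard-coded here.
    return "targets = [o for o in scene_objects if o['name'] == 'mug']"

def visuomotor_policy(targets):
    # Stand-in for a learned low-level (e.g. diffusion) policy.
    return {"move_to": targets[0]["position"]} if targets else {"noop": True}

scene_objects = [
    {"name": "mug", "position": (0.40, 0.10, 0.02)},
    {"name": "plate", "position": (0.60, -0.20, 0.01)},
]

namespace = {"scene_objects": scene_objects}
exec(vlm_generate_code("put the mug on the plate"), namespace)  # high level
print(visuomotor_policy(namespace["targets"]))                  # low level
```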
eFlesh is an under-$5, 3D-printable tactile sensor that (a) can be any shape you want, (b) is robust against magnetic interference, and (c) is deformable! Led by @venkyp2000 and @Raunaqmb. I can't wait to use it myself and see others using it. Check it out!
Tactile sensing is gaining traction, but slowly. Why? Because integration remains difficult. But what if adding touch sensors to your robot was as easy as hitting “print”? Introducing eFlesh: a 3D-printable, customizable tactile sensor. Shape it. Size it. Print it. 🧶👇
Learning task-agnostic tactile representations is very valuable for dexterity! Check out this cool work by @akashshrm02 that explores this while integrating the history of tactile information. This enables highly dexterous tasks—like plug insertion with a giant hand! 😁
Robots need touch for human-like hands to reach the goal of general manipulation. However, approaches today either don't use tactile sensing or use a specific architecture per tactile task. Can 1 model improve many tactile tasks? 🌟Introducing Sparsh-skin: tinyurl.com/y935wz5c 1/6
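(Not from the paper, just the general pattern the question points at: one shared tactile encoder over a history of readings, reused across tasks via small task heads, rather than a bespoke architecture per task. All dimensions below are placeholders.)

```python
# Generic sketch of the "one tactile model, many tasks" pattern.
import torch
import torch.nn as nn

class TactileEncoder(nn.Module):
    """Shared encoder over a window of tactile readings (history included)."""
    def __init__(self, taxels=96, history=16, dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(taxels * history, 512), nn.ReLU(),
            nn.Linear(512, dim),
        )
    def forward(self, x):
        return self.net(x)

encoder = TactileEncoder()                       # pretrained once, reused
heads = nn.ModuleDict({
    "slip_detection": nn.Linear(256, 2),         # per-task lightweight heads
    "force_estimation": nn.Linear(256, 3),
    "pose_estimation": nn.Linear(256, 6),
})

tactile_window = torch.randn(8, 16, 96)          # batch of tactile histories
features = encoder(tactile_window)
outputs = {name: head(features) for name, head in heads.items()}
```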
RUKA is warming up for our EXPO demo today @ICRA2025 with the help of our first-time teleoperators, @venkyp2000 and @LIUPEIQI2 🫰 Come try teleoperating RUKA yourself from 1–5 PM at the exhibit hall! 🧤 For more info before coming -> ruka-hand.github.io :) #ICRA2025 @anyazorin
Last week, we introduced RUKA — a fully open-sourced, 3D-printable humanoid hand. Today, we're excited to release the software stack as well: github.com/ruka-hand/RUKA! It comes with detailed instructions for:
- Calibration
- Control
- Teleoperation
...and more! Check it out ✌️
NYU researchers have introduced RUKA, an open-source, tendon-driven robotic hand with 15 DOF that costs only $1.3k and can operate for 20 straight hours without any performance loss. It learns joint-to-actuator and fingertip-to-actuator models from motion-capture data.
New open-source, 3D-printable, tendon-driven robotic hand capable of a wide range of tasks that need strength and dexterity.