Remi Cadene
@RemiCadene
AI for Robotics at @HuggingFace 🤗 Focusing on @LeRobotHF
Meet the game-changer: SO-100 🦾 Crafted by @therobotstudio and @huggingface 🤗 At 1/3 the cost and 2x the capabilities of our previous arms, it's the most accessible, high-performance robotic arm for $115. Easiest DIY at home! 1/🧵 Link and details in thread 👇

Generalization needs data. But data collection is hard for precise tasks like plugging USBs, swiping cards, inserting plugs, and keying locks. Introducing robust, precise VisuoTactile Local (ViTaL) policies: >90% success rates from just 30 demos and 45 min of real-world RL.🧶⬇️
Implemented @physical_int’s Real‑Time Chunking (RTC) on @huggingface’s SmolVLA in the @LeRobotHF repo! It noticeably reduces jerky motion compared with basic merge strategies during async inference!🧵1/
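For context, here is a toy sketch of the kind of "basic merge strategy" that RTC improves on: linearly cross-fading the overlap between the currently executing action chunk and a freshly predicted one. Everything here (function name, the linear weighting) is illustrative, not the actual LeRobot or RTC implementation; RTC itself uses a more sophisticated inpainting-style scheme.

```python
def blend_chunks(old_chunk, new_chunk, overlap):
    """Merge two overlapping action chunks (lists of scalar actions).

    Keeps the non-overlapping head of old_chunk, cross-fades the
    `overlap` shared steps with a weight that ramps toward new_chunk,
    then appends the remainder of new_chunk. Hard-switching instead of
    blending is what produces visibly jerky motion.
    """
    merged = list(old_chunk[:len(old_chunk) - overlap])
    for i in range(overlap):
        w = (i + 1) / (overlap + 1)  # weight ramps toward the new chunk
        a_old = old_chunk[len(old_chunk) - overlap + i]
        merged.append((1 - w) * a_old + w * new_chunk[i])
    merged.extend(new_chunk[overlap:])
    return merged

# Old chunk commanded 0.0, new chunk commands 1.0: the merged
# trajectory ramps smoothly instead of jumping.
print(blend_chunks([0.0] * 4, [1.0] * 4, overlap=2))
```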
LeKiwi's got a new look🤯 We redesigned our mobile manipulator from the ground up with full Dynamixel integration 🦾 Check out the build instructions on our GitHub (linked below)👀 Huge thanks to @ROBOTISAmerica for sponsoring this project!
XLeRobot 0.2.3 (XLeVR) out! xlerobot.readthedocs.io • VR (Quest 3) Whole-body Control • Reads everything: head and hand poses, joysticks, and all the buttons • Minimal dependencies, web-based • Modular, can use it on other robots • Open-source, easy install, play in 20min
i love the simplicity of @LeRobotHF to interact with robots, especially for beginners like me

there is one huge problem: it's written in python, but i love js

introducing: LeRobot.js

interact with your robot directly in the browser
5 days after launch and we've sold $1M of Reachy Mini

Steadily getting $50k orders per day following the release peak

Time to ramp up production!
4mo ago: We bought a used forklift & strapped an AI kit to it

4mo later: We’re moving hundreds of pallets a day in a customer’s warehouse

Yesterday I got 3 requests totaling 100+ forklifts

We have to scale right now

V2 coming soon
Quick update and announcement: 24h since we announced Reachy Mini and we're quickly approaching $500,000 in pre-orders (!) We knew this product was special (everyone on the team wanted one), but the response has been even more overwhelming than expected—thanks for the incredible…
Thrilled to finally share what we've been working on for months at @huggingface 🤝@pollenrobotics Our first robot: Reachy Mini A dream come true: cute and low priced, hackable yet easy to use, powered by open-source and the infinite community. Tiny price, small size, huge…
Happy to announce 🤗Datasets 4! We've added the most requested feature 👀 Introducing streaming data pipelines for Hugging Face Datasets ✨ With support for large, multimodal datasets in any standard file format, and with num_proc= for speed ⚡
$120k worth of order in just a few hours! 🤗 - Cute robots will be present in our daily lives - Reachy mini is an important step forward - And it's open-source! youtube.com/watch?v=JvdBJZ…
Reachy Mini 🤗
SmolVLM has been accepted to @COLM_conf 2025 🥳! See you in Montreal!
Introducing the smollest VLMs yet! 🤏 SmolVLM (256M & 500M) runs on <1GB GPU memory. Fine-tune it on your laptop and run it on your toaster. 🚀 Even the 256M model outperforms our Idefics 80B (Aug '23). How small can we go? 👀
We tested WSRL (Warm-start RL) on a Franka Robot, and it leads to really efficient online RL fine-tuning in the real world! WSRL learned the peg insertion task perfectly with only 11 minutes of warmup and *7 minutes* of online RL interactions 👇🧵
🚀 With minimal data and a straightforward training setup, our VisuoTactile Local Policy (ViTaL) fuses egocentric vision + tactile feedback to achieve millimeter-level precision & zero-shot generalization! 🤖✨ Details ▶️ vitalprecise.github.io
Current robot policies often face a tradeoff: they're either precise (but brittle) or generalizable (but imprecise). We present ViTaL, a framework that lets robots generalize precise, contact-rich manipulation skills across unseen environments with millimeter-level precision. 🧵
Recording more @LeRobotHF datasets today. I’m using the Ryobi to “reset my environment” 😁
recorded my first @LeRobotHF dataset today 🦾🤖 I finally got my hands on it @ErikKaum 😊
Inspiring work :)
What would a World Model look like if we start from a real embodied agent acting in the real world? It has to have: 1) A real, physically grounded and complex action space—not just abstract control signals. 2) Diverse, real-life scenarios and activities. Or in short: It has to…