Haitham Bou Ammar
@hbouammar
RL team leader @Huawei R&D UK | H. Assistant Professor @UCL | Ex-@Princeton, UPenn (thou/thine)
🫠 Yo, after 1 year of really hard work, we have made a new cheap robot: a 14K USD bimanual robot. It supports VR teleoperation and imitation learning, and connects with DeepSeek and VLMs for embodied planning tasks. The cool part is that we try to make life easier for non-experts by creating…
Dear NeurIPS reviewers, please be reminded to delete the GPT prompts next time :)
The amazing folks at @EdinburghNLP will be presenting a few papers at ACL 2025 (@aclmeeting); if you're in Vienna, touch base with them! Here are the papers in the main track 🧵
To get you started, we have open-sourced our code and written many tutorials: Code: github.com/Robotics-Ark - Please star the repo 🙏 Tutorials: arkrobotics.notion.site/ARK-Home-22be0… Welcome to the Python revolution of robotics! Join us, because together we can do more! #AI #Robotics
After lots of hard work 🤠 , we have finally begun to make serious progress on Python-based robotics. (0) pip installable - really, you just need pip install ark-robotics (1) 4 widely-used robots are now fully supported (2) Sensors have been open-sourced, including (maybe the…
Everyone wants VLMs in robotics, but, well, VLMs are not trained or grounded in robotics! So we fix that with ExPTeach: Experience is the Best Teacher, which grounds VLMs using self-generated memory. It's remarkable to see how it can recover from failures by…
Want robot imitation learning to generalize to new tasks? Blindfold your human demonstrator! Best robotics paper at EXAIT Workshop #ICML2025 openreview.net/forum?id=zqfT2… Wait, why does this make sense? Read below!
Excited to announce a call for papers for our workshop on time series foundation models @NeurIPSConf, with an amazing lineup of speakers. See the post below for more details on where to submit (deadline: 22nd of August).
🚀 We are happy to organize the BERT²S workshop @NeurIPSConf 2025 on Recent Advances in Time Series Foundation Models. 🌐 berts-workshop.github.io 📜Submit by August 22 🎓Speakers and panelists: @ChenghaoLiu15 Mingsheng Long @zoe_piran @danielle_maddix @atalwalkar @qingsongedu
For machine learning in robotics 🤖, collecting data is, of course, essential. I am happy to announce that we have now open-sourced Ark Interfaces, which lets you collect data from your robot via, well, an interface 😍 We started by implementing a PS4 controller, which…
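The shape of such a teleoperation data-collection loop can be sketched as follows — a hedged illustration only, with hypothetical names (`axes_to_twist`, `Logger`; Ark's real API may differ): poll controller axes, map them to a velocity command, and log (observation, action) pairs for later imitation learning.

```python
# Minimal sketch of a teleop data logger; names are hypothetical, not Ark's API.
import json


def axes_to_twist(lx: float, ly: float, max_speed: float = 0.5):
    """Map left-stick axes in [-1, 1] to a planar velocity command.
    Pushing the stick forward (ly = -1) drives the robot forward."""
    return {"vx": max_speed * -ly, "vy": max_speed * lx}


class Logger:
    def __init__(self):
        self.episode = []

    def record(self, obs, action):
        self.episode.append({"obs": obs, "action": action})

    def save(self, path):
        # Dump the episode as JSON for offline imitation learning.
        with open(path, "w") as f:
            json.dump(self.episode, f)


# Simulated stick readings stand in for a real PS4 controller driver.
fake_readings = [(0.0, -1.0), (0.5, 0.0), (0.0, 0.0)]
log = Logger()
for t, (lx, ly) in enumerate(fake_readings):
    action = axes_to_twist(lx, ly)
    log.record({"t": t}, action)
print(len(log.episode))  # number of logged steps
```

Swapping the simulated readings for a real gamepad driver and the dict observation for camera/joint states gives the usual demonstrate-then-imitate pipeline.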

4 stages of training LLMs from scratch, clearly explained (with visuals):
Exactly. Thx
Prof Arora is simply pointing to an oft-repeated claim by current-paradigm skeptics - "current capabilities are not evidence of complex abilities" - when that's not the real claim. The real claim is that capabilities are arriving quickly. Imagine throwing an IMO problem at GPT-3.5.
Many have asked me about Bourbaki and how it works at a high level. They said that although the blogs (huggingface.co/blog/hba123/bo…) are useful, they don't have time to read them. I can't blame them! Given that there are a zillion papers a day, who can? I just made a very simple…
I wrote a fun little article about all the ways to dodge the need for real-world robot data. I think it has a cute title. sergeylevine.substack.com/p/sporks-of-agi
Terrorism at its best.
Beyond the horrific massacres we have seen, our brethren in Sweida give us a peek into the extensive damage that has been done to Druze civilian homes, which were looted, burned and bombed!
et voilà: pypi.org/project/ark-ro… #robotics2025 #roboticsurgery #Robotica

As promised, and after requests from many, we have scheduled the first live session about Ark for the 28th of July. #AI For those who are already in the messaging channel, all is done, no need to do anything :-D For those interested in…
