Preston Culbertson
@pdculbert
Assistant professor, @Cornell_CS. Learning, perception, and control for building more dynamic robots.
I'm so excited to announce I will be joining @Cornell_CS as an assistant professor next fall! I can't wait to launch my lab and start building robots with human-level dexterity and adaptability. 🤖🏃🤹

ICYMI: For #CoRL2024 we released a dataset of 3.5M (!) dexterous grasps, with multi-trial labels and perceptual data for 4.3k objects. Our takeaways: scale matters, and refining grasps > better sampling. Hoping our data can enable more vision-based grasping on hardware!
There have been many large grasping datasets released recently, but few demos of real-world grasping using generative models. How do we close this gap? Introducing: Get a Grip (#CoRL2024)! We show that discriminative models, rather than generative ones, can achieve sim2real transfer! 👀🧵👇
Congrats to @albert_h_li for winning Outstanding Paper today at the CoRL workshop on learning fine + dexterous manipulation!

Come to our NeRF-Shop today at #ICRA22! We may have some live demos 😊
We're excited to announce our @ieee_ras_icra workshop "Motion Planning with Implicit Neural Representations of Geometry"! We'll discuss the future of INRs -- like DeepSDFs, NeRFs, and more -- in robotics. Submissions due April 15. neural-implicit-workshop.stanford.edu
Check out our paper “Vision-Only Robot Navigation in a Neural Radiance World” at #ICRA2022! Come learn how to use NeRFs for trajectory optimization, state estimation, and MPC. We’ll be presenting at 4:15pm on Thursday in Room 124! More info: mikh3x4.github.io/nerf-navigatio…
Reminder: submissions are due to our @ieee_ras_icra workshop on Motion Planning with Implicit Neural Geometry by this Friday, April 15! Looking forward to an exciting discussion on integrating vision and learning with robot planning. More details: neural-implicit-workshop.stanford.edu