Charlie S. Burlingham
@csburlingham
Vision Science & AI @ Meta Reality Labs Research
🏖️ Headed to St. Pete Beach for VSS! Presenting new work Monday morning with Phil Guan, Ian Erkelens & Oliver Cossairt @RealityLabs. Come say hi! 📄 Abstract: visionsciences.org/presentation/?… 🔬 More from Meta Reality Labs: visionsciences.org/presentation/?… visionsciences.org/presentation/?… #VSS2025

🎉 New paper out! We show that training improves motion categorization but doesn't reduce (and can even worsen) misperceptions, explained via a model combining efficient coding + implicit categorization + increased encoding precision journals.plos.org/ploscompbiol/a…
So in 2007, physicists wrote a paper that made headlines: according to their calculations, human coin flips aren't 50/50 - more like 51/49. Why is that, and did students in Amsterdam really flip coins 350,000 times to find out? 🧵
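(Not from the thread itself, just a back-of-the-envelope check.) A minimal Python sketch, taking the tweet's rough ~51/49 figure as a placeholder bias, of why it takes on the order of 350,000 flips to separate 0.51 from 0.50: the standard error of the observed proportion shrinks like 1/√n.

```python
import math

# Illustrative sanity check (not the paper's actual analysis):
# how precisely do n coin flips pin down the flip probability?
p_fair, p_biased, n = 0.50, 0.51, 350_000  # 0.51 is the tweet's rough figure

# Standard error of the observed proportion after n Bernoulli flips,
# computed under the fair-coin null hypothesis.
se = math.sqrt(p_fair * (1 - p_fair) / n)

print(f"standard error at n = {n:,}: {se:.5f}")  # ~0.00085
print(f"0.51 vs 0.50 gap, in standard errors: {(p_biased - p_fair) / se:.1f}")  # ~11.8
```

At n = 350,000 the one-point gap is roughly a dozen standard errors, which is why a sample that large can settle the question.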
I’m very excited to share that my graduate work is now online in @ScienceMagazine today! With generous help from my mentor @yuji_ikegaya and my amazing teammates, we investigated a top-down pathway for volitional heart rate regulation! science.org/doi/10.1126/sc…
All-day AR would benefit from AI models that understand a person's context, and eye tracking could be key for task recognition. Yet past work - including our own research.facebook.com/publications/c… - hasn't found much added value from gaze in addition to computer vision & egocentric video 2/
Got butterflies in your stomach? 😵‍💫 I am super excited to share the first major study of my postdoc @visceral_mind! We report a multidimensional mental health signature of stomach-brain coupling in the largest sample to date 🧵👇 biorxiv.org/content/10.110…
New article on unifying perceived magnitude and discriminability is out: pnas.org/doi/10.1073/pn… @EeroSimoncelli @lyndoryndo
The PETMEI Workshop at @ETRA_conference 2024 kicked off with a keynote by @MichaelProulx from @RealityLabs: an insightful talk on the challenges of pervasive eye tracking for interaction in #ExtendedReality. petmei.org/2024/ #ETRA2024
Paper Session 1: Visual Attention @ETRA_conference 2024 just started. @olegkomo from @RealityLabs and @txst is now presenting their paper on "Per-Subject Oculomotor Plant Mathematical Models and the Reliability of Their Parameters" at #ETRA2024. Paper: doi.org/10.1145/3654701
.@csburlingham from @RealityLabs is now presenting their paper titled "Real-World Scanpaths Exhibit Long-Term Temporal Dependencies: Considerations for Contextual AI for AR Applications" at the PETMEI Workshop at #ETRA2024. Paper: doi.org/10.1145/364990…