Ken Pfeuffer
@KenPfeuffer
Prof @csaudk, #HCI #XR, eye-gaze UI. Sharing things from research, etc.
What's science saying about the UI of @Apple's new HMD? In 2021-22 we studied the techniques of Gaze+Pinch (#VisionPro) vs Handray (#HoloLens #MetaQuest). Result: Gaze+Pinch is faster & requires less physical effort, provided targets fit the eye tracker's accuracy. More details & papers below.
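Roughly, the core of Gaze+Pinch selection can be sketched like this (a minimal illustration, not code from the papers: the scene, the `gaze_pinch_select` helper, and the 1.5° threshold are all made up for the example). On pinch onset, the target nearest the gaze ray wins, but only if it sits inside the eye tracker's accuracy cone:

```python
import math
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def sub(self, o): return Vec3(self.x - o.x, self.y - o.y, self.z - o.z)
    def dot(self, o): return self.x * o.x + self.y * o.y + self.z * o.z
    def norm(self): return math.sqrt(self.dot(self))

def angle_deg(a: Vec3, b: Vec3) -> float:
    """Angle between two direction vectors, in degrees."""
    c = a.dot(b) / (a.norm() * b.norm())
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def gaze_pinch_select(eye_pos: Vec3, gaze_dir: Vec3, targets, accuracy_deg=1.5):
    """On pinch onset, choose the target closest to the gaze ray,
    but only if it lies inside the eye tracker's accuracy cone."""
    best, best_angle = None, accuracy_deg
    for t in targets:
        a = angle_deg(gaze_dir, t.sub(eye_pos))
        if a < best_angle:
            best, best_angle = t, a
    return best  # None: no target fit within tracker accuracy

# Gaze straight ahead (+z): the near-axis target is picked, the off-axis one is not
print(gaze_pinch_select(Vec3(0, 0, 0), Vec3(0, 0, 1),
                        [Vec3(0.01, 0, 2), Vec3(0.5, 0, 2)]))
```

This is also where the "targets fit the eye tracker's accuracy" caveat lives: shrink or crowd the targets below the accuracy cone and the selection starts returning the wrong neighbour, or nothing at all.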

I am looking for excellent students to do PhD work with me on spatial and multimodal human-AI collaboration. Deadline soon! Read more here: phd.nat.au.dk/for-applicants…
Adding two more open PhD positions at the department! Don't miss out on this opportunity to work with @jensemildk, @aslanix, @XRStefanie, @tobiaslanglotz, @KenPfeuffer, @DavideMottin or @aroraakhilcs 🎉 Check out the projects and apply before Aug 1 ➡️ phd.nat.au.dk/for-applicants…
= ISMAR GEMINI Workshop Deadline Extended to 25 July = There is still time to submit to our Workshop on Gaze Interaction in XR at ISMAR 2025. You are welcome to participate if you have a vision or work-in-progress that you would like to discuss with us! @KenPfeuffer @HansGellersen
CFP: 3rd Workshop on Gaze and Eye Movement in Interaction in XR (GEMINI) at ISMAR 2025. Join us on this exciting journey as we collectively reshape the future of XR interaction: sites.google.com/view/w-gemini/…
Improving hand-ray input for XR has generally been a trade-off between speed, ergonomics, and accuracy. But what if we could break away from go-go CD gains, clutching, and the other usual tricks, and instead take a very ecological approach?
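For readers outside the field: "go-go CD gain" refers to the nonlinear reach amplification of the classic Go-Go technique (Poupyrev et al., 1996). A minimal sketch, with illustrative threshold and gain values:

```python
def go_go_distance(r_real: float, d: float = 0.4, k: float = 6.0) -> float:
    """Go-Go nonlinear CD gain (after Poupyrev et al., 1996): within
    threshold d (metres) the virtual hand tracks the real hand 1:1;
    beyond d, reach is amplified quadratically. k is tuned per setup."""
    if r_real < d:
        return r_real
    return r_real + k * (r_real - d) ** 2

# A hand extended 0.6 m reaches 0.6 + 6 * 0.2**2 = 0.84 m virtually
print(go_go_distance(0.6))
```

The trade-off alluded to above is visible in the formula: a larger k buys reach but costs accuracy, since small hand jitter is amplified quadratically at a distance.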
Looking forward to giving an online talk at Google Research tomorrow on UI evolution and past/future Eye-Hand Symbiosis research. It’s an incredible honour. Pretty wild that I’m still deep into this decade-long eye-hand quest. Just can't get enough! More to come in 2025!
What light is real? Demo where you control both real light beams (like Philips Hue) and virtual ones in #AR. Real and virtual lights blend so naturally, they almost become indistinguishable. Kudos to our students Mads, Oliver, Magnus for going the extra mile!
XR Light Designer: A Standalone Application for Lighting Design and Pre-Visualization on the Apple Vision Pro A Master Thesis by Mads Kristian Steen Vejrup, Oliver Due Nielsen, and Magnus Frisk, Aarhus University @csaudk, 2025, supported by P1 (Danish AI Pioneer Center). This…
Fully-funded #PhD positions in #ComputerScience at @AarhusUni! Apply for one of 4 projects in #HCI, #AI, #XR & #Cybersecurity — or propose your own via the general call. Deadline: Aug 1, 2025 🔗 phd.nat.au.dk/for-applicants…
On Monday we will be showing where we are at the midway point of the Authoring in XR year. Here is one of 3 demos. Join us? youtu.be/NEhVTuzo21Y futuretextlab.info/2025-schedule/
I was fired by Apple today. Me and my design team have spent the last 18 months tirelessly testing different levels of Gaussian blur on backgrounds when foreground elements are in focus. If you are looking for experts in the blur, liquid glass, grass or fur UI space, lmk.
But you can’t even see your phone screen? I would have liked to see iPhone and iPad mirroring so you can control them with gestures and eye tracking.
iphone released in 2007
macbook released in 2006
2025, still no adequate eye tracking
we should not have to use our fingers to navigate a device...
Eye tracking is no longer just a behind-the-scenes feature – it’s becoming central to how we learn, train, and perform in XR. In our latest blog, we explore how eye tracking at Varjo has evolved from a tool for powering foveated rendering into a key enabler of breakthrough…
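For the rendering half of that story, gaze-contingent foveation boils down to spending GPU time where the eye actually is. A generic sketch (the thresholds and rates below are illustrative, not Varjo's actual pipeline):

```python
def shading_rate(eccentricity_deg: float) -> int:
    """Map angular distance from the current gaze point to a coarse
    shading rate: 1 = full resolution, 2 = half, 4 = quarter.
    Thresholds here are illustrative only."""
    if eccentricity_deg < 5.0:    # fovea: render at full detail
        return 1
    if eccentricity_deg < 15.0:   # parafovea: half rate is rarely noticed
        return 2
    return 4                      # periphery: quarter rate saves most GPU time
```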
Check out this excerpt from a talk by my colleague and long time VR scientist/pioneer @Stanford Jeremy Bailenson. He explains that biological humans reflexively accommodate AR humans and show VR “manners”. Interesting.
When I got up this morning I had no clue I would be invited to see Jeremy Bailenson speak at Stanford University. He runs Stanford’s Holodeck research. XR/AR/VR. Here he shares the latest research. Exclusive to X. This talk changed several things I thought I knew about the…
Had the pleasure of speaking for The Future of Text on the evolution of XR input – from mouse and touch to XR and hybrid gaze interfaces. Explored input as an evolving organism: 👁️✋ Post: bit.ly/4jb5pzG Video: bit.ly/4do5SNL Thanks to @liquidizer & @dgrigar!
It’s in my hands 🎉 Honored to get the @sigchi Special Recognition for Industry-Academia collab at the #CHI2025 awards dinner a few nights ago, shared with an incredible team. And to have been in a room w/ legends like Wobbrock, Churchill, Zhai, Ishii & more. 🙌 Crazy stuff!