Paul Streli
@paulstreli
PhD student at SIPLAB, @ETH Zürich; previously @meta, @imperialcollege, @tiktok_uk; @Meta PhD Research Fellow. http://paulstreli.com
Introducing TouchInsight for detecting touch input from all ten fingers on any surface, using only vision-based hand tracking. w/ Mark Richardson, Fadi Botros, Shugao Ma, Robert Wang, and @cholz. Paper: arxiv.org/abs/2410.05940. Join us for a demo at #UIST2024 next week!
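To give a feel for the input/output interface of a system like this, here is a minimal sketch. It is not TouchInsight's actual model (the paper describes a learned, uncertainty-aware classifier); it only shows how per-frame fingertip positions from a hand tracker could be turned into touch events. The `HandFrame` structure and the 5 mm threshold are illustrative assumptions.

```python
import numpy as np
from dataclasses import dataclass

# Illustrative data structure; a real hand tracker (e.g., on a headset)
# would supply 3D keypoints for all ten fingers per frame.
@dataclass
class HandFrame:
    fingertips: np.ndarray  # shape (10, 3): x, y, z per fingertip, in meters

def detect_touches(frame: HandFrame,
                   surface_point: np.ndarray,
                   surface_normal: np.ndarray,
                   threshold_m: float = 0.005) -> list[int]:
    """Return indices of fingertips within `threshold_m` of the surface plane.

    This naive plane-distance test is only a placeholder for TouchInsight's
    learned touch classifier (see arxiv.org/abs/2410.05940).
    """
    n = surface_normal / np.linalg.norm(surface_normal)
    # Signed distance of each fingertip to the plane through surface_point.
    dists = (frame.fingertips - surface_point) @ n
    return [i for i, d in enumerate(dists) if abs(d) < threshold_m]

# Example: index fingertip (index 1) resting on the z = 0 tabletop.
tips = np.full((10, 3), 0.05)          # all fingertips hover 5 cm above
tips[1] = [0.10, 0.20, 0.002]          # 2 mm above the surface -> touch
frame = HandFrame(fingertips=tips)
print(detect_touches(frame, np.zeros(3), np.array([0.0, 0.0, 1.0])))  # [1]
```

A fixed geometric threshold like this breaks down under hand-tracking noise, which is exactly the regime the paper's probabilistic approach is designed for.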
#CHI2025 Human-in-the-loop optimization (HiLO) often starts from scratch, which is slow and inefficient. How can we leverage prior experience to boost HiLO? We introduce Continual HiLO (CHiLO): a framework in which computational optimizers learn across users and improve over a lifetime.
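The core idea, in sketch form: instead of restarting the optimizer for every user, carry state across users. Below is a deliberately simple stand-in (random search seeded by past users' best parameters), not the CHiLO method itself; the class name, proposal rule, and hyperparameters are all made up for illustration.

```python
import random

class ContinualOptimizer:
    """Toy continual human-in-the-loop optimizer.

    Stand-in for the idea of carrying experience across users: each new
    user's search is seeded with previous users' best parameters rather
    than starting from scratch.
    """

    def __init__(self, low: float, high: float):
        self.low, self.high = low, high
        self.memory: list[float] = []  # best parameter found per past user

    def optimize_for_user(self, user_objective, budget: int = 20) -> float:
        # Warm start: reuse past users' best parameters as first candidates.
        candidates = list(self.memory) or [random.uniform(self.low, self.high)]
        best_x = max(candidates, key=user_objective)
        best_y = user_objective(best_x)
        for _ in range(budget):
            # Explore locally around the incumbent (toy proposal rule).
            x = min(self.high, max(self.low, best_x + random.gauss(0, 0.1)))
            y = user_objective(x)  # one "user trial" in the loop
            if y > best_y:
                best_x, best_y = x, y
        self.memory.append(best_x)  # retain experience for the next user
        return best_x

# Two simulated users whose preferred setting differs slightly; the second
# user's search starts from what worked for the first.
opt = ContinualOptimizer(0.0, 1.0)
print(opt.optimize_for_user(lambda x: -(x - 0.60) ** 2))
print(opt.optimize_for_user(lambda x: -(x - 0.65) ** 2))
```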
Releasing EgoSim, a simulator for body-worn cameras, at #NeurIPS2024 D&B. EgoSim takes real mocap data (e.g., AMASS) and synthesizes multi-modal egocentric videos. Plus: MultiEgoView, a real dataset from 6 body-worn GoPro cameras with ground-truth 3D poses across several activities (13 participants).
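At a high level, a body-worn-camera simulator has to turn per-frame joint transforms (e.g., from an AMASS sequence) into camera extrinsics for a renderer. Here is a minimal sketch of that one step, assuming a 4x4 homogeneous-matrix convention; the function name and mount offset are illustrative, and the real EgoSim also handles rendering, multiple modalities, and camera artifacts.

```python
import numpy as np

def camera_extrinsics(joint_world: np.ndarray,
                      mount_offset: np.ndarray) -> np.ndarray:
    """World-to-camera matrix for a camera rigidly mounted on a body joint.

    joint_world:  (4, 4) joint-to-world transform for one mocap frame
                  (e.g., a wrist or chest joint from an AMASS sequence).
    mount_offset: (4, 4) fixed camera-to-joint transform describing how
                  the camera is strapped onto the body.
    """
    camera_to_world = joint_world @ mount_offset
    return np.linalg.inv(camera_to_world)  # extrinsics a renderer consumes

# Toy sequence: the joint translates along x over 3 frames.
frames = []
for t in range(3):
    T = np.eye(4)
    T[0, 3] = 0.1 * t  # joint moves 10 cm per frame
    frames.append(T)

offset = np.eye(4)
offset[2, 3] = 0.05  # camera mounted 5 cm in front of the joint

# One extrinsic matrix per frame; rendering these viewpoints over the
# mocap sequence yields the synthetic egocentric video for this camera.
extrinsics = [camera_extrinsics(T, offset) for T in frames]
print(extrinsics[1])
```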
🔬 My lab at @NorthwesternU has a new website! Visit spice-lab.org to see our latest research from CHI, ECCV & UIST 2024. 🎓 We're recruiting 2 PhD students (fully funded) for Fall 2025! Interested candidates, apply by Dec 1. Please RT to help spread the word!
new: MANIKIN reveals & overcomes the limitations of SMPL-based models for full-body IK tasks via:
– a biomechanically-inspired neural-analytical formulation
– a neural IK method for predicting body poses from end-effectors
@eccvconf #ECCV2024 @cs_jiaxi_jiang @paulstreli @Xuejingluo @CSatETH
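For a flavor of the "neural-analytical" hybrid: a learned model proposes a full-body pose from sparse end-effector targets, and an analytical step (here, classic two-bone IK for an arm) corrects the limb so the end effector is reached exactly. This is a generic illustration of the hybrid idea, not MANIKIN's formulation; the neural part is stubbed out and all numbers are made up.

```python
import numpy as np

def two_bone_ik(shoulder: np.ndarray, target: np.ndarray,
                upper_len: float, lower_len: float) -> float:
    """Analytical elbow angle so the wrist reaches `target` exactly.

    Classic two-bone IK via the law of cosines; returns the elbow's
    interior angle in radians (targets clamped to the reachable range).
    """
    d = np.clip(np.linalg.norm(target - shoulder),
                1e-6, upper_len + lower_len)
    cos_elbow = (upper_len**2 + lower_len**2 - d**2) / (2 * upper_len * lower_len)
    return float(np.arccos(np.clip(cos_elbow, -1.0, 1.0)))

def neural_pose_prior(end_effectors: dict) -> dict:
    """Stub for a learned full-body pose predictor (in a system like
    MANIKIN this role is played by a trained network conditioned on
    end-effector positions)."""
    return {"elbow_angle": 1.6}  # plausible but inexact initial guess

# Hybrid solve: neural proposal, then analytical correction of the elbow
# so the wrist constraint is satisfied exactly.
targets = {"wrist": np.array([0.45, 0.0, 0.0])}
pose = neural_pose_prior(targets)
pose["elbow_angle"] = two_bone_ik(np.zeros(3), targets["wrist"],
                                  upper_len=0.30, lower_len=0.25)
print(pose)  # elbow angle adjusted so the 0.55 m arm reaches 0.45 m
```

The division of labor is the point: the network supplies a plausible whole-body pose, while closed-form geometry guarantees the hard end-effector constraints that purely learned predictors tend to miss.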