Jon Froehlich
@jonfroehlich
Professor of HCI at @uwcse | Director, Makeability Lab http://makeabilitylab.cs.uw.edu | Founder, http://projectsidewalk.org | #HCI+AI #AR #UrbanScience #Access
I used to ❤️ Twitter as a place to genuinely connect & learn from others. It's not that place anymore. I uninstalled Twitter in April '22 & have used it less and less since. Other platforms haven't worked as a replacement; hoping Bluesky will be different. Join me here: bsky.app/profile/jonfro…
We have extended our deadline to September 27th! Please join us and share your Urban+AI thought pieces, position papers, works-in-progress, demos, and more!🤗
📢Do you work on Urban + AI topics? How will AI impact the design of equitable & accessible cities, transit systems, and mapping tools? Submit to the 4th Annual Workshop on "The Future of Urban Accessibility: The Role of AI"; Deadline Sept 20th. CFP: accessiblecities.github.io/UrbanAccess202…
Today, we kick off #ASSETS2024 in beautiful St. John's, Newfoundland. Would love to meet up with you if you're also here!
Can your robot vacuum do this? Researchers in @UW @uwengineering #UWAllen’s @makeabilitylab adapted one to create MobiPrint, a 3D printer on wheels that maps a room and prints objects on location, on demand, based on a user's needs. #UWinnovates #NSFfunded washington.edu/news/2024/10/2…
Thrilled to be heading to #ASSETS24 to present “Engaging with Children’s Artwork in Mixed Visual-Ability Settings”, done with my wonderful collaborators and advisors @wobbrockjo and @jonfroehlich! I will be presenting on Mon, Oct 28 as a part of Session 1A: Creativity 🖼️🎨 (1/7)
On this last day of #UIST2024, PhD student @XiaSu09 will be presenting our joint work w/ Chang Xiao and Eunyee Koh at Adobe Research on SonifyAR, which uses LLMs to generate contextual sound effects in AR environments. Talk is 2-3:15PM in Allegheny 2. makeabilitylab.cs.washington.edu/project/Sonify…
Check out our AI-powered wearable AR system for low vision users, which recognizes and augments object affordance to support safe and efficient interactions! Led by amazing @jaewook_jae! #UIST2024
Hey hey! I'll be at #UIST2024 to present 𝗖𝗼𝗼𝗸𝗔𝗥 ☺️ CookAR is an AI-powered wearable AR prototype that augments kitchen tool affordances in near real-time for low vision users, enabling safer and more efficient tool interactions. (1/6)
An exciting October: 11 papers, posters, & demos at UIST, ASSETS, & VIS! 🎉 Congrats to our students & collaborators. Special recognition to Jae Lee for a Best Paper Inclusion Award at UIST'24, to Arnavi Chedda-Kothary for a Best Paper Nomination at ASSETS'24, & to Chu Li for our lab's first-ever VIS paper
We are ecstatic to have you! 🎉🎈🎊 Welcome to @uwcse and the @makeabilitylab!
I am ecstatic to announce that I am starting my PhD in CS at the UW Allen School @uwcse. I'll be working with the @makeabilitylab and @jonfroehlich making cities more sustainable, accessible, and equitable. I'm incredibly grateful for this opportunity and excited to get to work!
Come be my colleague! We're hiring TWO tenure-track Assistant Professors at @UW_iSchool in AI, Data Science, and HCI 📊💻👩‍💻🌄 Link to apply: apply.interfolio.com/150031 Feel free to reach out with any questions!
Thrilled to receive an NIH-SCH award with amazing collaborators @YapengTian and @jonfroehlich on AI/AR-assisted vision for people with low vision! With real-time scene interpretation techniques and multi-modal AR augmentations, we will support low vision users in a variety of everyday tasks.