GOATUI: Generating Opportunistic Adaptive Tangible User Interfaces through Function Alignment in Augmented Reality

Under Review

Project Teaser Image

Abstract

Opportunistic Tangible User Interfaces (TUIs) leverage everyday objects to provide haptic feedback when users interact with spatial interfaces. Existing approaches to generating opportunistic TUIs rely on manual assignment or user-defined adaptation rules and focus mainly on translating surface-based interaction logic to the physical world, overlooking the rich interaction potential of objects. Building on a formative study, we introduce GOATUI, an AR system that automatically generates TUIs by directly aligning the functions of objects' embodied interactions with the goals of digital tasks. The system employs a vision-language model (VLM) to perceive objects' interaction potentials within the environment and a vision-based pipeline to define each object's interaction realm. GOATUI then leverages large language models (LLMs) to assess each interaction's functional structure compatibility and functional purpose congruence with the task, generating suitable TUIs. Our user study demonstrates that automatic TUI generation through function alignment reduces mental load and creates more expressive and immersive experiences, increasing users' willingness to engage with TUIs.
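
To make the function-alignment idea concrete, the sketch below shows one way the alignment step described above could be structured: an LLM is asked to score a candidate embodied interaction against a digital task on the two dimensions named in the abstract (functional structure compatibility and functional purpose congruence), and the highest-scoring candidate is chosen. This is a minimal illustrative sketch, not GOATUI's actual implementation; all names, fields, prompt wording, and the scoring scheme are assumptions.

    # Illustrative sketch only; data model, prompt, and scoring are assumed, not from the paper.
    from dataclasses import dataclass

    @dataclass
    class EmbodiedInteraction:
        object_name: str   # e.g. "stapler"
        action: str        # e.g. "press down on the hinged top"
        structure: str     # e.g. "discrete, spring-loaded, returns to rest"
        purpose: str       # e.g. "fasten sheets together"

    @dataclass
    class DigitalTask:
        name: str          # e.g. "confirm selection"
        structure: str     # e.g. "single discrete trigger"
        goal: str          # e.g. "commit the current choice"

    def alignment_prompt(ix: EmbodiedInteraction, task: DigitalTask) -> str:
        """Build a prompt asking an LLM to rate structure compatibility and
        purpose congruence between a physical interaction and a digital task."""
        return (
            "Rate the following pairing on two 1-5 scales.\n"
            f"Physical interaction: {ix.action} on a {ix.object_name} "
            f"(structure: {ix.structure}; purpose: {ix.purpose}).\n"
            f"Digital task: {task.name} (structure: {task.structure}; goal: {task.goal}).\n"
            "1) Functional structure compatibility: do the interaction dynamics match the task's input structure?\n"
            "2) Functional purpose congruence: does the interaction's real-world purpose match the task's goal?\n"
            'Reply as JSON: {"structure": int, "purpose": int}.'
        )

    def pick_tui(candidates, task, score_with_llm):
        """Return the candidate interaction with the highest combined alignment score.
        `score_with_llm` is any callable that sends the prompt to an LLM and returns
        the parsed {"structure": int, "purpose": int} dict."""
        def combined(ix):
            scores = score_with_llm(alignment_prompt(ix, task))
            return scores["structure"] + scores["purpose"]
        return max(candidates, key=combined)

In this reading, the VLM and vision pipeline would populate the candidate interactions and their interaction realms, while the LLM scoring above performs the final function alignment; how GOATUI actually combines or weights the two criteria is described in the paper itself.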