Ad Hoc UI: On-the-fly Transformation of Everyday Objects Into Tangible 6DoF Interfaces for AR

Interaction with AR content on smartphones and AR glasses is often limited by the lack of reliable hand tracking or dedicated 6DoF controllers. The screen's narrow field of view also makes it challenging to discover digital content, especially when input is limited to on-device touch. We aim to improve the usability of AR interaction on lightweight AR devices by allowing users to seamlessly anchor input and digital content to uninstrumented everyday objects that remain discoverable beyond the boundaries of the screen. Recent advances in AR/VR technologies, real-time environmental tracking, and lightweight machine learning algorithms make it possible to track most everyday objects with minimal acquisition time and without dedicated fiducial markers. Combined with the extensive body of research on Tangible User Interfaces, this creates new opportunities for opportunistic interfaces built from everyday objects. This paper explores the design space of opportunistic interfaces and interaction techniques that instantly transform everyday objects into interactive widgets. We present a design space identifying key aspects and challenges for opportunistic interfaces. We further introduce Ad Hoc UI, a prototyping toolkit that empowers users with on-the-fly tracking of everyday objects, 6DoF actions, audio activation, and mid-air gestures. We provide an open-source implementation to help AR researchers and practitioners effortlessly integrate opportunistic interfaces into their AR experiences and amplify the expression of their creative vision.

Note: the full version of the paper was not accepted by the CHI / UIST community, and I do not have much time to continue polishing it. The reviewers felt that the conceptual delta over OmniTouch is quite small and that the technical novelty is limited. To reproduce the results of the currently published paper in Unity, one can use Google's open-sourced Live Transcribe for voice input, together with an image tracker and OpenCV. Ping me if you need further assistance. I'd like to open-source the code when I have more cycles and this project gets higher priority.


[Teaser image: transforming everyday objects into tangible 6DoF interfaces using Ad Hoc UI]

Opportunistic Interfaces for Augmented Reality: Transforming Everyday Objects Into Tangible 6DoF Interfaces Using Ad Hoc UI

Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems (CHI), 2022.
Keywords: augmented reality, everyday objects, tangible user interface, 3D user interface, 6DoF, spatial interaction, markerless tracking, tangible interaction, hand gestures, XR interaction




Cited By

  • Weilun Gong, Stephanie Santosa, Tovi Grossman, Michael Glueck, Daniel Clarke, and Frances Lai. Affordance-Based and User-Defined Gestures for Spatial Tangible Interaction. Proceedings of the 2023 ACM Designing Interactive Systems Conference. source | cite | search
  • Kyzyl Monteiro, Ritik Vatsal, Neil Chulpongsatorn, Aman Parnami, and Ryo Suzuki. Teachable Reality: Prototyping Tangible Augmented Reality With Everyday Objects by Leveraging Interactive Machine Teaching. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. source | cite | search
  • Mustafa Doga Dogan, Raul Garcia-Martin, Patrick William Haertel, Jamison John O'Keefe, Raul Sanchez-Reillo, and Stefanie Mueller. Demonstrating BrightMarkers: Fluorescent Tracking Markers Embedded in 3D Printed Objects. Adjunct Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology. source | cite | search