A Context-Assisted, Semi-Automated Activity Recall Interface Allowing Uncertainty
Published at
Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)
2025
Abstract
Measuring activities and postures is an important area of research in ubiquitous computing, human-computer interaction, and personal health informatics. One approach that researchers use to collect large amounts of labeled data for developing activity recognition and measurement models is asking participants to self-report their daily activities. Although participants can typically recall their sequence of daily activities, remembering the precise start and end times of each activity is significantly more challenging. ACAI is a novel, context-assisted Activity Annotation Interface that enables participants to efficiently label their activities by accepting or adjusting system-generated activity suggestions while explicitly expressing uncertainty about temporal boundaries. We evaluated ACAI in two complementary studies: a usability study with 11 participants and a two-week, free-living study with 14 participants. We compared our activity annotation system with the current gold-standard methods for activity recall in health sciences research: 24PAR and its computerized version, ACT24. Our system reduced annotation time and perceived effort while significantly improving data validity and fidelity compared to both standard human-supervised and unsupervised activity recall approaches. We discuss the limitations of our design and the implications for developing adaptive, human-in-the-loop activity recognition systems for collecting self-reported activity data.