Understanding the Daily Lives of Older Adults: Integrating Multi-modal Personal Health Tracking Data through Visualization and Large Language Models

Justin Steinberg
Xiwen Li
Bingsheng Yao
Dakuo Wang
Elizabeth Mynatt
Published at AAAI Fall Symposium Series on AI for Aging in Place 2024

Abstract

Understanding the daily lives and routines of older adults is crucial to facilitating aging in place. Ubiquitous computing technologies such as smartphones and wearables, which are easy to deploy and scale, have become a popular method for collecting comprehensive, longitudinal data across various demographics. Despite their popularity, several challenges persist when targeting the older adult population, such as low compliance and difficulty obtaining feedback. In this work-in-progress paper, we present the design and development of a multi-modal sensing system that includes a phone, a watch, and a voice assistant. We are conducting an initial longitudinal study with one older adult participant over 30 days to explore how various types of data can be integrated through visualization techniques and large language models (LLMs). We discuss our preliminary insights from the collected data and conclude with our future plans and directions for this research.
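To make concrete what integrating multi-modal tracking data through an LLM might look like, the sketch below shows one hypothetical approach: flattening a day of heterogeneous signals (watch, phone, voice assistant) into a text prompt an LLM can summarize. This is a minimal illustration, not the authors' system; the `DailyRecord` fields, `build_summary_prompt` helper, and all values are assumptions introduced for this example.

```python
"""Hypothetical sketch: fusing one day of multi-modal tracking data
into a natural-language prompt for LLM-based daily summarization."""
from dataclasses import dataclass, field
from datetime import date


@dataclass
class DailyRecord:
    day: date
    steps: int                # from the smartwatch
    sleep_hours: float        # from the smartwatch
    screen_minutes: int       # from the phone
    voice_notes: list = field(default_factory=list)  # voice-assistant self-reports


def build_summary_prompt(record: DailyRecord) -> str:
    """Flatten heterogeneous sensor and self-report data into plain text."""
    notes = "; ".join(record.voice_notes) or "none"
    return (
        f"Date: {record.day.isoformat()}\n"
        f"Steps: {record.steps}\n"
        f"Sleep: {record.sleep_hours:.1f} h\n"
        f"Phone screen time: {record.screen_minutes} min\n"
        f"Voice-assistant self-reports: {notes}\n"
        "Summarize this older adult's day in two sentences, "
        "noting any deviation from a typical routine."
    )


if __name__ == "__main__":
    record = DailyRecord(
        day=date(2024, 6, 1),
        steps=4200,
        sleep_hours=6.5,
        screen_minutes=95,
        voice_notes=["felt tired after lunch", "went to the pharmacy"],
    )
    # The resulting prompt would be sent to an LLM API of choice.
    print(build_summary_prompt(record))
```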

Materials