Jiachen presented a paper at AI for Aging in Place Symposium

Jiachen presented her work-in-progress paper at the AAAI Fall Symposium Series on AI for Aging in Place in Alexandria, VA.

This work-in-progress explores a multi-modal sensing system using smartphones, wearables, and voice assistants to study older adults’ daily lives, integrating data with visualization techniques and large language models in a 30-day pilot study.

Understanding the daily lives and routines of older adults is crucial to facilitating aging in place. Ubiquitous computing technologies like smartphones and wearables, which are easy to deploy and scale, have become a popular means of collecting comprehensive, longitudinal data across various demographics. Despite their popularity, several challenges persist when targeting the older adult population, such as low compliance and difficulty obtaining feedback. In this work-in-progress paper, we present the design and development of a multi-modal sensing system that includes a phone, watch, and voice assistant. We are conducting an initial longitudinal study with one older adult participant over 30 days to explore how various types of data can be integrated through visualization techniques and large language models (LLMs). As a work in progress, we discuss our preliminary insights from the collected data and conclude with a discussion of our future plans and directions for this research.