MIND: Empowering Mental Health Clinicians with Multimodal Data Insights through a Narrative Dashboard

Ruishi Zou
Shiyu Xu
Margaret E. Morris
Jihan Ryu
Timothy D. Becker
Nicholas Allen
Anne Marie Albano
Randy Auerbach
Dan Adler
Lace M. Padilla
Dakuo Wang
Ryan Sultan
Xuhai Xu
Published in the Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems (CHI 2026)
  • Best Paper Honorable Mention

Abstract

Advances in data collection enable the capture of rich patient-generated data, from passive sensing (e.g., wearables and smartphones) to active self-reports (e.g., cross-sectional surveys and ecological momentary assessments). Although prior research has demonstrated the utility of patient-generated data in mental healthcare, significant challenges remain in effectively presenting these data streams alongside clinical data (e.g., clinical notes) for clinical decision-making. Through co-design sessions with five clinicians, we propose MIND, a large language model-powered dashboard designed to present clinically relevant multimodal data insights for mental healthcare. MIND presents multimodal insights through narrative text, complemented by charts communicating the underlying data. Our user study (N=16) demonstrates that clinicians perceive MIND as a significant improvement over baseline methods, reporting improved ability to reveal hidden, clinically relevant data insights (p<.001) and support their decision-making (p=.004). Grounded in the study results, we discuss future research opportunities for integrating data narratives into broader clinical practice.

Materials