MIND: Empowering Mental Health Clinicians with Multimodal Data Insights through a Narrative Dashboard
Ruishi Zou
Shiyu Xu
Margaret E Morris
Jihan Ryu
Timothy D. Becker
Nicholas Allen
Anne Marie Albano
Randy Auerbach
Dan Adler
Lace M. Padilla
Dakuo Wang
Ryan Sultan
Xuhai Xu
Published at
Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems
2026
- Best Paper Honorable Mention
Abstract
Advances in data collection enable the capture of rich patient-generated data: from passive sensing (e.g., wearables and smartphones) to active self-reports (e.g., cross-sectional surveys and ecological momentary assessments). Although prior research has demonstrated the utility of patient-generated data in mental healthcare, significant challenges remain in effectively presenting these data streams alongside clinical data (e.g., clinical notes) for clinical decision-making. Through co-design sessions with five clinicians, we propose MIND, a large language model-powered dashboard designed to present clinically relevant multimodal data insights for mental healthcare. MIND presents multimodal insights through narrative text, complemented by charts communicating the underlying data. Our user study (N=16) demonstrates that clinicians perceive MIND as a significant improvement over baseline methods, reporting an improved ability to reveal hidden, clinically relevant data insights (p<.001) and support their decision-making (p=.004). Grounded in the study results, we discuss future research opportunities to integrate data narratives into broader clinical practices.