Paper Title
"Are you okay, honey?": Recognizing Emotions among Couples Managing Diabetes in Daily Life using Multimodal Real-World Smartwatch Data
Paper Authors
Paper Abstract
Couples generally manage chronic diseases together, and the management takes an emotional toll on both patients and their romantic partners. Consequently, recognizing the emotions of each partner in daily life could provide insight into their emotional well-being in chronic disease management. Currently, the process of assessing each partner's emotions is manual, time-intensive, and costly. Despite the existence of works on emotion recognition among couples, none of these works have used data collected from couples' interactions in daily life. In this work, we collected 85 hours (1,021 five-minute samples) of real-world multimodal smartwatch sensor data (speech, heart rate, accelerometer, and gyroscope) and self-reported emotion data (n=612) from 26 partners (13 couples) managing type 2 diabetes mellitus in daily life. We extracted physiological, movement, acoustic, and linguistic features, and trained machine learning models (support vector machine and random forest) to recognize each partner's self-reported emotions (valence and arousal). Our results from the best models (balanced accuracies of 63.8% and 78.1% for arousal and valence, respectively) are better than chance and better than our prior work, which also used data from German-speaking, Swiss-based couples, albeit in the lab. This work contributes toward building automated emotion recognition systems that would eventually enable partners to monitor their emotions in daily life and enable the delivery of interventions to improve their emotional well-being.
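To make the modeling step concrete, the sketch below shows one plausible way to train and evaluate such classifiers. It is not the authors' code: the feature matrix `X`, labels `y`, group IDs `groups`, and the feature count are hypothetical placeholders standing in for the extracted physiological, movement, acoustic, and linguistic features, and scikit-learn is assumed as the ML library. Grouped cross-validation by partner and balanced accuracy mirror the kind of evaluation the abstract reports.

```python
# Minimal sketch (assumptions noted above, not the paper's released code):
# recognize a binary self-reported emotion label from pre-extracted features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features = 1021, 40                # hypothetical sizes for illustration
X = rng.normal(size=(n_samples, n_features))    # placeholder multimodal features per 5-minute sample
y = rng.integers(0, 2, size=n_samples)          # placeholder label (e.g., arousal: high vs. low)
groups = rng.integers(0, 26, size=n_samples)    # partner IDs, so no partner spans train and test folds

models = [
    ("SVM", SVC(class_weight="balanced")),
    ("Random forest", RandomForestClassifier(n_estimators=300, class_weight="balanced")),
]
for name, model in models:
    # Balanced accuracy compensates for the class imbalance typical of self-reported emotion labels.
    scores = cross_val_score(model, X, y, groups=groups,
                             cv=GroupKFold(n_splits=5), scoring="balanced_accuracy")
    print(f"{name}: balanced accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```

With random placeholder features this prints chance-level scores; the point is only to illustrate the partner-grouped cross-validation and balanced-accuracy evaluation, not to reproduce the reported 63.8% / 78.1% results.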