A Multimodal Wearable and Mobile Health Platform to Assess Bidirectional Relationships Between Epilepsy and Neuropsychiatric Comorbidities
Abstract number :
881
Submission category :
2. Translational Research / 2B. Devices, Technologies, Stem Cells
Year :
2020
Submission ID :
2423215
Source :
www.aesnet.org
Presentation date :
12/7/2020 1:26:24 PM
Published date :
Nov 21, 2020, 02:24 AM
Authors :
Lydia Wheeler - Mayo Clinic; Vaclav Kremen - Mayo Clinic; Tal Pal Attia - Mayo Clinic; Mona Nasseri - Mayo Clinic; Benjamin Brinkmann - Mayo Clinic; Gregory Worrell - Mayo Clinic
Rationale:
Neuropsychiatric comorbidities, particularly affective disorders such as depression and anxiety, affect 30%-70% of the 65 million people with epilepsy (PWE). Affective comorbidities are independent predictors of increased seizure severity and frequency, reduced tolerance of pharmacotherapy, prolonged post-ictal states, poor quality of life, and increased mortality risk. Studies show that incorporating frequent affect measurements into therapeutic decisions results in fewer hospital visits, reduced medication use, and improved quality of life. However, current affect measurements are sparse and rarely used to individualize patient therapy. There is a clinical need for new, validated tools that allow continuous measurement of affective states. Using wearable technology, phones, and deep learning, we have developed a real-time, continuous, low-power platform for automated affect recognition and prediction.
Method:
We describe a mobile health platform for affect recognition and prediction that integrates with the Mayo Epilepsy Personal Assistant Device (EPAD). Smartwatch sensors worn on the nondominant wrist capture electrodermal activity (4 Hz), photoplethysmography (64 Hz), temperature (4 Hz), and 3-axis accelerometer data (32 Hz). Interactive assessments are administered via the watch and phone; features are also generated from passive data such as location. We developed a multimodal transformer for sequential learning that takes these features as input and outputs affect measures: valence, arousal, categorical mood and emotion, and the Immediate Mood Scaler (IMS-12). The platform also includes a seizure diary, medication manager, and symptom-tracking options.
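The sensor streams above arrive at different sampling rates, so a preprocessing step must place them on a common timeline before they can be stacked as model input. The sketch below is an illustrative assumption, not the platform's actual pipeline: it uses simple linear interpolation and a hypothetical shared rate of 32 Hz.

```python
import numpy as np

def resample_stream(values, src_hz, dst_hz, duration_s):
    """Linearly interpolate a 1-D sensor stream onto a common timeline."""
    src_t = np.arange(int(src_hz * duration_s)) / src_hz
    dst_t = np.arange(int(dst_hz * duration_s)) / dst_hz
    return np.interp(dst_t, src_t, values)

# Simulated 10-second streams at the sampling rates reported in the abstract.
duration = 10
eda  = np.random.rand(4 * duration)    # electrodermal activity, 4 Hz
ppg  = np.random.rand(64 * duration)   # photoplethysmography, 64 Hz
temp = np.random.rand(4 * duration)    # skin temperature, 4 Hz
acc  = np.random.rand(32 * duration)   # one accelerometer axis, 32 Hz

common_hz = 32  # hypothetical shared rate for model input
aligned = np.stack([
    resample_stream(s, hz, common_hz, duration)
    for s, hz in [(eda, 4), (ppg, 64), (temp, 4), (acc, 32)]
])
print(aligned.shape)  # (4, 320): modalities x time steps
```

A multimodal transformer may instead consume the raw, unaligned sequences directly; explicit resampling is just one common baseline for fusing multi-rate wearable data.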
Results:
The user interface was developed for iOS and Android. Dense behavioral inputs and physiological signals are acquired through active and passive patient interaction with the phone and wearable. Data are synchronized between the devices and a cloud repository over BLE and WiFi. Comprehensive proof-of-concept testing on public datasets shows that our algorithm overcomes multimodal challenges such as inherent data nonalignment due to variable sampling rates and long-range dependencies between elements across modalities. Incorporating intra- and intermodal dynamics allows an increased number of self-attention operations on a low-resource device.
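One way a transformer can relate sequences of different lengths without explicit resampling is cross-modal attention: queries come from one modality, keys and values from another, and the attention weights implicitly learn the alignment. The NumPy sketch below illustrates this mechanism under stated assumptions (random projection weights, hypothetical feature dimensions); it is not the abstract's model.

```python
import numpy as np

def cross_modal_attention(q_seq, kv_seq, d):
    """Scaled dot-product attention where queries come from one modality
    and keys/values from another; sequence lengths may differ."""
    rng = np.random.default_rng(0)
    Wq = rng.normal(size=(q_seq.shape[1], d))   # query projection
    Wk = rng.normal(size=(kv_seq.shape[1], d))  # key projection
    Wv = rng.normal(size=(kv_seq.shape[1], d))  # value projection
    Q, K, V = q_seq @ Wq, kv_seq @ Wk, kv_seq @ Wv
    scores = Q @ K.T / np.sqrt(d)               # (len_q, len_kv)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over kv positions
    return weights @ V                          # (len_q, d)

# A 4 Hz EDA sequence (40 steps over 10 s) attends to a 64 Hz PPG
# sequence (640 steps) with no resampling; feature dims are illustrative.
eda_feats = np.random.rand(40, 8)
ppg_feats = np.random.rand(640, 8)
out = cross_modal_attention(eda_feats, ppg_feats, d=16)
print(out.shape)  # (40, 16)
```

Each of the 40 low-rate query positions produces a weighted summary of all 640 high-rate positions, which is how attention sidesteps the variable-sampling-rate alignment problem mentioned above.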
Conclusion:
We have developed a wearable and mobile platform that enables real-time, continuous monitoring of affect in PWE through multimodal sensing. Our novel algorithm for affect recognition and prediction outperforms current state-of-the-art methods by a significant margin and is deployable on a mobile phone. The ability to decode affect over time could enable closed-loop systems to treat neuropsychiatric disorders. Next steps are to broaden the platform to include cognitive comorbidities and to integrate these measures into seizure prediction models. Investigating the relationships between comorbidity dynamics and epilepsy may be key to individualizing patient therapy, augmenting seizure forecasting, and improving patient outcomes.
Funding:
This study was funded by the National Institutes of Health (UH2/UH3-NS095495).