SEEG Activity Varies During Facial Emotion Processing
Abstract number :
3.45
Submission category :
11. Behavior/Neuropsychology/Language / 11A. Adult
Year :
2022
Submission ID :
2232977
Source :
www.aesnet.org
Presentation date :
12/5/2022 12:00:00 PM
Published date :
Nov 22, 2022, 05:29 AM
Authors :
Kaitlyn Davis, MA – University of Alabama at Birmingham (UAB); Adam Goodman, PhD – University of Alabama at Birmingham; Daniel Janko, BS – University of Alabama at Birmingham; John Magnotti, PhD – University of Pennsylvania; Zhengjia Wang, PhD – University of Pennsylvania; Jerzy Szaflarski, MD, PhD – University of Alabama at Birmingham
This is a Late Breaking abstract
Rationale: Facial emotion processing is critical for successful social interactions. Adaptive responses to such emotional information are enabled by a fronto-limbic network. Increased theta-band activity in this network has been linked to facial emotion processing, yet the extent to which this potential mechanism varies across regions and emotions remains unclear. The goal of this study was to further clarify the role of theta activity in this process. We used stereo-EEG (sEEG) to test the hypothesis that theta amplitude differs between happy, sad, fearful, and neutral facial expressions within a network comprising the amygdala, hippocampus, insula, and anterior cingulate cortex (ACC).
Methods: A total of 22 patients undergoing sEEG monitoring (2048 Hz sampling rate) completed an implicit Emotional Faces Task. The task consisted of 120 static face images, counterbalanced for race, sex, and emotional expression, each presented for 2 s on a laptop monitor with an inter-trial interval jittered between 1000 and 5000 ms. During sEEG recording, participants indicated the biological sex of each presented face using a keyboard; only trials in which participants correctly identified the sex of the face were included in further analysis. Electrodes were localized using the iElectrodes toolbox in MATLAB. R Analysis and Visualization of iEEG (RAVE) was used for signal processing and analysis. Standard preprocessing included notch filtering (60, 120, and 180 Hz), downsampling to 200 Hz, Morlet wavelet transforms, and common average referencing. Data were visually inspected to remove artifacts, and trials exceeding 4 SD were removed after visual confirmation. Trial-averaged analyses examined normalized theta (4-7 Hz) amplitude for each emotion across the 2 s stimulus duration relative to the prestimulus baseline. For each ROI, a linear mixed-effects model (LME) was fit with Emotion as a fixed effect and Subject and Electrode as random effects. Post-hoc pairwise comparisons were used to determine differences between emotions, and false discovery rate correction was applied to account for multiple comparisons (α = 0.05).
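The modeling step above can be sketched as follows. This is a minimal Python illustration using statsmodels and synthetic data, not the authors' actual pipeline (RAVE's analysis is implemented in R); the variable names, the nesting of Electrode within Subject as a variance component, and the placeholder p-values are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.multitest import multipletests

# Synthetic trial-averaged theta amplitudes: Subject and Electrode are
# random effects, Emotion is the fixed effect of interest.
rng = np.random.default_rng(0)
emotions = ["Happy", "Sad", "Fearful", "Neutral"]
rows = []
for subj in range(5):                       # illustrative subject count
    for elec in range(3):                   # illustrative electrodes per subject
        for emo in emotions:
            for trial in range(10):
                rows.append({
                    "subject": subj,
                    "electrode": f"s{subj}_e{elec}",
                    "emotion": emo,
                    # small simulated Fearful effect, purely for the sketch
                    "theta_amp": rng.normal(0.5 + 0.2 * (emo == "Fearful"), 1.0),
                })
df = pd.DataFrame(rows)

# LME: Emotion as a fixed effect; Subject as the grouping factor with
# Electrode modeled as a variance component nested within Subject.
model = smf.mixedlm(
    "theta_amp ~ C(emotion)", df, groups="subject",
    vc_formula={"electrode": "0 + C(electrode)"},
)
result = model.fit()

# FDR (Benjamini-Hochberg) correction across post-hoc pairwise comparisons;
# the p-values here are placeholders standing in for pairwise-test output.
pvals = [0.01, 0.04, 0.20, 0.03]
rejected, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
```

With four emotion levels, `result.fe_params` holds an intercept plus three treatment-coded contrasts against the alphabetically first level; in practice one would derive all six pairwise contrasts from the fitted model before applying the FDR step.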
Results: Seventeen of 22 participants were included in the analyses (5 excluded due to artifacts, seizures during testing, or missing data). Pairwise comparisons revealed differences between emotions for each ROI. The amygdala showed higher amplitude for Fearful than Happy or Neutral faces (ps < 0.05). The hippocampus showed higher amplitude for Neutral than Happy or Fearful faces (ps ≤ 0.01). The insula showed higher amplitude for Sad than Happy, Fearful, or Neutral faces (ps < 0.01). The ACC showed no significant differences between emotions, but responses to all emotions differed from 0 (ps < 0.001).