Auditory Language Event-Related Potentials From Invasive EEG Recordings
Abstract number: 2.188
Submission category: 3. Clinical Neurophysiology
Year: 2007
Submission ID: 7637
Source: www.aesnet.org
Presentation date: 11/30/2007
Published date: Nov 29, 2007
Authors: J. Taki, D. R. Nair
Rationale: Spoken language is thought to be first perceived as sound by the auditory cortex and then processed sequentially by various regions of the language network. Recent neuroimaging and electrophysiological studies have shown that auditory language processing involves the classical language areas as well as additional areas in the lateral temporal cortex, mainly in the dominant hemisphere. However, the precise localization of the brain areas involved in processing spoken language differs among studies, and the temporal sequence of their activation is still unknown. The objective of our study was to identify the brain areas involved in spoken language processing and their temporal activation pattern using event-related potentials (ERPs) in patients with chronically implanted subdural electrodes for epilepsy surgery.

Methods: Three patients with subdural grids over the left perisylvian region were studied. The subjects were asked to listen to spoken words or non-word control sounds, which were created by playing recordings of the words backward. The subjects were asked to distinguish between these stimuli with a button press using the index finger ipsilateral to the grid implantation. EEG and the electrical signal from the auditory presentation device were recorded simultaneously and, during off-line analysis, averaged time-locked to stimulus onset. Event-related frequency analysis was also performed.

Results: ERPs were seen in the superior and middle temporal gyri, the posterior part of the inferior frontal gyrus, and the posterior part of the superior temporal gyrus. Event-related synchronization (ERS) in the beta to gamma frequency range was seen in the posterior part of the inferior frontal gyrus and the posterior part of the superior temporal gyrus. These activities showed no clear difference between word and non-word control stimuli. In one subject, a larger negative ERP was seen in the supramarginal gyrus for word stimuli than for non-word stimuli, and in another patient a larger negative ERP was seen in the supramarginal gyrus, extending to the postcentral gyrus, for non-word stimuli.

Conclusions: These results suggest that complex sounds, whether words or non-words, are processed similarly in Broca's area, Wernicke's area, and the superior and middle temporal gyri. Differentiation of language sounds may instead be carried out by other cortical areas such as the dominant supramarginal gyrus. However, studies with a larger number of subjects will be necessary to detect subtle differences in how the language network processes spoken words versus non-words with this technique.
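The two analyses described in Methods, stimulus-locked averaging to obtain ERPs and event-related frequency analysis yielding an ERS measure, can be sketched as follows. This is a minimal illustration only: the sampling rate, epoch window, band edges, and variable names are placeholder assumptions, not parameters reported in the abstract.

```python
# Sketch of stimulus-locked ERP averaging and band-power ERS.
# All numeric values below are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 1000                 # assumed sampling rate (Hz)
PRE, POST = 0.2, 0.8      # assumed epoch window around stimulus onset (s)

def epoch(eeg, onsets, fs=FS, pre=PRE, post=POST):
    """Cut stimulus-locked epochs from a (channels, samples) recording,
    given stimulus-onset sample indices recovered from the trigger channel."""
    n_pre, n_post = int(pre * fs), int(post * fs)
    return np.stack([eeg[:, s - n_pre:s + n_post] for s in onsets])  # (trials, ch, time)

def erp(epochs, fs=FS, pre=PRE):
    """Baseline-correct each trial and average across trials,
    time-locked to stimulus onset."""
    baseline = epochs[:, :, :int(pre * fs)].mean(axis=-1, keepdims=True)
    return (epochs - baseline).mean(axis=0)  # (channels, time)

def ers(epochs, band=(15.0, 80.0), fs=FS, pre=PRE):
    """Band power relative to the pre-stimulus baseline; values > 1 indicate
    event-related synchronization in the band (here spanning beta to gamma)."""
    b, a = butter(4, np.array(band) / (fs / 2), btype="band")
    power = np.abs(hilbert(filtfilt(b, a, epochs, axis=-1), axis=-1)) ** 2
    mean_power = power.mean(axis=0)                          # (channels, time)
    baseline = mean_power[:, :int(pre * fs)].mean(axis=-1, keepdims=True)
    return mean_power / baseline
```

In practice, word and non-word trials would be epoched separately so that their ERPs and ERS maps can be contrasted channel by channel, as done in the Results.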