Uni- vs. Multisensory Training
Pre- and post-training EEG and behavioral data from multisensory vs. unisensory training.
The data are provided in the RawData folder.
The EEG and event-file folder structure for each group is self-explanatory.
Subjects Y001 and Y002 from the MusicPlast group and subjects Y023 and Y025 from the UniPlast group were measured with Brain Products' actiCAP-128 cap; all other subjects were measured with Brain Products' R-Net.
EEG was recorded using a multisensory mismatch paradigm with the following conditions: Congruent, Incongruent, Auditory mismatch, and Visual mismatch.
The event files contain two types of triggers: one defining the condition (1: Congruent, 2: Incongruent, 4: Auditory mismatch, 8: Visual mismatch) and one marking the timepoint of stimulus presentation (64). Hence, timepoint 0 of stimulus presentation is trigger 64, and the condition is defined by the trigger immediately preceding it.
The Congruent condition contains three consecutive triggers of 64; its timepoint 0 is the 64 trigger whose preceding and following triggers are also 64 (i.e., the middle of the three).
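The onset-finding rule above can be sketched as follows. This is an illustrative Python snippet, not the authors' code; it assumes the trigger codes have already been read from an event file into a simple list in temporal order.

```python
# Map condition-trigger codes to condition names (from the README).
CONDITIONS = {1: "Congruent", 2: "Incongruent",
              4: "Auditory mismatch", 8: "Visual mismatch"}

def find_onsets(codes):
    """Return (index, condition) pairs for each stimulus onset (trigger 64).

    `codes` is a hypothetical list of trigger codes in temporal order.
    """
    onsets = []
    for i, code in enumerate(codes):
        if code != 64:
            continue
        prev_c = codes[i - 1] if i > 0 else None
        next_c = codes[i + 1] if i + 1 < len(codes) else None
        if prev_c == 64 and next_c == 64:
            # Congruent: three consecutive 64s; timepoint 0 is the middle one.
            onsets.append((i, "Congruent"))
        elif prev_c in (2, 4, 8):
            # Other conditions: the condition trigger immediately precedes 64.
            onsets.append((i, CONDITIONS[prev_c]))
    return onsets
```

For example, the sequence `[2, 64, 1, 64, 64, 64, 8, 64]` yields an Incongruent onset at index 1, a Congruent onset at index 4 (the middle 64), and a Visual-mismatch onset at index 7.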
The file Cognitive.sav is an SPSS file containing d-prime (d') and cognitive-evaluation data for each subject.
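For reference, d' is conventionally computed as the difference of the inverse-normal-transformed hit and false-alarm rates; the sketch below shows that standard formula and is not necessarily the exact procedure used for Cognitive.sav.

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate):
    """Standard signal-detection d': Z(hit rate) - Z(false-alarm rate),
    where Z is the inverse CDF of the standard normal distribution."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)
```

In practice, hit and false-alarm rates of exactly 0 or 1 must be corrected (e.g., with a log-linear adjustment) before applying the inverse CDF.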
The file MIWithinBetweenBandsAllSubjectsAllTimesAllConditions2.m contains the MATLAB code for estimating GCMI on the single-trial data; it assumes the GCMI toolbox and Brainstorm are installed. Once the Hilbert-transformed beamformer data have been estimated for each trial in Brainstorm, they are exported, and the provided code estimates connectivity for each trial. The per-trial connectivity estimates are then simply averaged across the trials of each subject and condition to compile that subject's hypernetwork, which is subjected to statistical analysis via the Network Based Statistics toolbox.
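The trial-averaging step can be sketched as below. This is a minimal illustration in Python (the actual pipeline is MATLAB); the function name and array shapes are assumptions, and the per-trial GCMI matrices are taken as already computed.

```python
import numpy as np

def subject_hypernetwork(trial_matrices):
    """Average per-trial connectivity matrices into one subject-level network.

    `trial_matrices` is assumed to be array-like with shape
    (n_trials, n_nodes, n_nodes); the mean over trials gives the
    subject's network for one condition.
    """
    stack = np.asarray(trial_matrices, dtype=float)
    return stack.mean(axis=0)
```

One such averaged matrix per subject and condition is then what enters the group-level statistical comparison.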