EEG

Biosignal processing for automatic emotion recognition

Can we automatically detect changes in emotions given a user's biosignals? In this project, we used multimodal biosignal data to predict the target emotion of audiovisual stimuli.
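As a rough illustration of this kind of pipeline, the sketch below classifies emotion labels from hand-crafted biosignal features (per-channel mean, variance, and spectral power) with a cross-validated SVM. The data shapes, sampling rate, feature choices, and labels are hypothetical placeholders, not the project's actual setup.

```python
# Minimal sketch: emotion classification from biosignal trials.
# All data below are synthetic stand-ins for real recordings.
import numpy as np
from scipy.signal import welch
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_channels, n_samples, fs = 120, 8, 512, 128  # hypothetical shapes
signals = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, n_trials)  # e.g. low vs. high arousal

def trial_features(x):
    """Per-channel mean, variance, and total spectral power for one trial."""
    freqs, psd = welch(x, fs=fs, axis=-1)
    return np.concatenate([x.mean(-1), x.var(-1), psd.sum(-1)])

X = np.stack([trial_features(t) for t in signals])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```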

Classifying ADHD subtypes and sex using multimodal data

ADHD subtypes are a controversial aspect of the ADHD literature. Most subtype classifications are based on behavioral and cognitive data but lack biomarkers. Using a multimodal dataset comprising EEG recordings, self-reported symptoms, and behavioral data, we tried to predict the DSM subtype of each of our 96 participants. Since ADHD has been noted to present differently across sexes, we also tried to predict sex. Resting-state EEG data and behavioral data proved to be poor predictors of the DSM subtypes, while self-reported symptoms were a rich predictor of ADHD subtype. Predicting sex from EEG data yielded the highest decoding accuracies.
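The per-modality comparison described above can be sketched as fitting the same cross-validated classifier to each feature set in turn. Everything below (feature dimensions, number of classes, the logistic-regression choice) is a hypothetical stand-in for the real 96-participant dataset.

```python
# Hedged sketch: decode subtype and sex separately from each modality,
# then compare cross-validated accuracies. Arrays are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 96
modalities = {                       # hypothetical feature dimensions
    "resting-state EEG": rng.standard_normal((n, 300)),
    "behavioral":        rng.standard_normal((n, 20)),
    "self-reported":     rng.standard_normal((n, 40)),
}
subtype = rng.integers(0, 3, n)      # e.g. three DSM presentations
sex = rng.integers(0, 2, n)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
for target_name, y in [("subtype", subtype), ("sex", sex)]:
    for mod_name, X in modalities.items():
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{target_name} from {mod_name}: {acc:.2f}")
```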

Combining EEG, MRI, and behavioral datasets to learn more about music and the auditory system

In this project, I aim to combine data from different modalities (fMRI, EEG, and behavioral) to better understand sound and music processing. My main focus was to reproduce some of the results from a published paper, starting from the raw data.
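A reproduction effort like this typically begins by re-deriving basic quantities from the raw recordings. The sketch below shows a generic MNE-Python pass from raw EEG to an evoked response on simulated data; the channel names, events, and filter settings are placeholders rather than the published paper's actual pipeline.

```python
# Generic sketch: raw EEG -> band-pass filter -> epochs -> evoked response.
# Data are simulated via mne.io.RawArray; nothing here is study-specific.
import numpy as np
import mne

rng = np.random.default_rng(7)
sfreq, n_channels = 250.0, 16
info = mne.create_info([f"EEG{i:02d}" for i in range(n_channels)],
                       sfreq, ch_types="eeg")
raw = mne.io.RawArray(rng.standard_normal((n_channels, int(sfreq * 60))) * 1e-5,
                      info)

raw.filter(l_freq=1.0, h_freq=40.0)          # basic band-pass cleanup
events = np.array([[int(sfreq * t), 0, 1] for t in range(2, 58, 2)])
epochs = mne.Epochs(raw, events, event_id={"stimulus": 1},
                    tmin=-0.2, tmax=0.8, baseline=(None, 0), preload=True)
evoked = epochs.average()                     # average across epochs
print(evoked)
```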

Revealing similarities between deep learning models and brain EEG representations

Do artificial neural networks process visual images similarly to our brain? If so, how? In this project, we bridge deep learning and EEG signals, aiming to better understand our ability to process common visual stimuli such as objects, faces, scenes, and animals.
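One standard way to quantify this kind of similarity is representational similarity analysis (RSA): build a dissimilarity matrix over stimuli for the model and for the EEG, then correlate the two. The sketch below uses random matrices in place of real DNN activations and EEG patterns; the stimulus count and feature sizes are illustrative assumptions.

```python
# Minimal RSA sketch: correlate model and EEG representational
# dissimilarity matrices. Both feature matrices are random placeholders.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_stimuli = 50                                  # e.g. objects, faces, scenes
model_feats = rng.standard_normal((n_stimuli, 4096))   # DNN layer activations
eeg_patterns = rng.standard_normal((n_stimuli, 64))    # EEG at one time point

# Representational dissimilarity matrices (condensed form), 1 - correlation
rdm_model = pdist(model_feats, metric="correlation")
rdm_eeg = pdist(eeg_patterns, metric="correlation")

rho, p = spearmanr(rdm_model, rdm_eeg)
print(f"model-EEG RSA: rho={rho:.3f}, p={p:.3f}")
```

In practice, the EEG RDM is often recomputed at each time point, yielding a time course of model-brain similarity across the trial.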
