Decoding EEG Data with Machine Learning
What it is:
A decoder of EEG brain signals using machine learning.
What we did:
- Built a configurable training pipeline supporting multiple neural network architectures (CNN, Hybrid CNN-Transformer, Dual-Stream, EEGNet variants, and T-TIME)
- Created automated hyperparameter optimization using Optuna with comprehensive search spaces
- Decoded EEG data; results suggest distinct neural signals for different experimental conditions
Decoding brain data
We attempt to decode EEG brain signals using machine learning. The data are from projects at the Language and Cognitive Neuroscience Lab, Teachers College, Columbia University (Tang, 2022).
Approach
We built a machine learning system that analyzes EEG data and classifies the numerical processing a subject is performing. We experimented with multiple neural network architectures:
Raw EEG Processing:
- CNN (Convolutional Neural Networks): Direct processing of time-series EEG data
- EEGNet variants: Specialized architectures designed for EEG, including models with squeeze-and-excitation blocks
- Hybrid CNN-Transformer: Combines convolutional processing with attention mechanisms
- Dual-Stream: Processes both raw time-series and frequency-domain representations
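To make the CNN route concrete, here is a minimal sketch of a small convolutional classifier for raw EEG windows, assuming a PyTorch implementation; the layer sizes, channel count (64), window length (512 samples), and class count are illustrative placeholders, not the project's actual configuration.

```python
# Minimal sketch of a CNN for raw EEG windows (hypothetical dimensions).
import torch
import torch.nn as nn

class SimpleEEGCNN(nn.Module):
    def __init__(self, n_channels: int = 64, n_samples: int = 512, n_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution over each channel's time series
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(16),
            nn.ELU(),
            # Spatial convolution across all EEG channels
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Dropout(0.5),
        )
        # Infer the flattened feature size from a dummy input
        with torch.no_grad():
            n_features = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_features, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples) -> add a singleton "image" dimension
        x = self.features(x.unsqueeze(1))
        return self.classifier(x.flatten(start_dim=1))
```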
Advanced Adaptation:
- T-TIME (Test-Time Adaptation): Allows models to adapt to individual subjects in real-time
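The sketch below illustrates the general test-time adaptation idea behind this approach: take an unlabeled test batch, reduce prediction entropy by updating only the normalization layers, then predict. It is a simplified, generic illustration, not the published T-TIME algorithm, and it assumes a PyTorch model containing BatchNorm layers.

```python
# Simplified test-time adaptation via entropy minimization (not full T-TIME).
import torch
import torch.nn.functional as F

def adapt_on_batch(model: torch.nn.Module, x: torch.Tensor, lr: float = 1e-4) -> torch.Tensor:
    """Take one adaptation step on an unlabeled test batch, then return predictions."""
    # Adapt only normalization-layer parameters to keep the update conservative.
    params = [p for m in model.modules() if isinstance(m, torch.nn.BatchNorm2d)
              for p in m.parameters()]
    optimizer = torch.optim.Adam(params, lr=lr)

    model.train()  # let BatchNorm use test-batch statistics
    logits = model(x)
    probs = F.softmax(logits, dim=1)
    entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=1).mean()
    optimizer.zero_grad()
    entropy.backward()
    optimizer.step()

    model.eval()
    with torch.no_grad():
        return model(x).argmax(dim=1)
```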
Experimental “Tasks”
We tested our system on the subjects' data using the following classification "tasks":
- Landing Digit Task: Participants were primed with any number, and the stimulus showed 1, 2, 3, 4, 5, or 6 dots
- Decreasing Minus 1 Task: Participants were primed with a number followed by a stimulus one lower
- Increasing or Decreasing: Participants were primed with a number followed by a lower or higher number
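For concreteness, here is a hypothetical sketch of how trials could be selected and labeled for the three tasks above, assuming each trial records the prime number and the stimulus number of dots; the field names and helper functions are illustrative, not the project's actual code.

```python
# Hypothetical trial metadata and task labeling (illustrative only).
from dataclasses import dataclass

@dataclass
class Trial:
    prime: int     # number the participant was primed with
    stimulus: int  # number of dots shown afterwards

def landing_digit_label(trial: Trial) -> int:
    # 6-class problem: which digit (1-6) the stimulus lands on
    return trial.stimulus

def is_minus_one_trial(trial: Trial) -> bool:
    # "Decreasing Minus 1" trials: stimulus is exactly one below the prime
    return trial.stimulus == trial.prime - 1

def direction_label(trial: Trial) -> int:
    # Binary "Increasing or Decreasing" task: 1 if higher than the prime, 0 if lower
    return int(trial.stimulus > trial.prime)
```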
Findings so far
The confusion matrices show that our models can distinguish between different numerical conditions above chance level, indicating that unique neural signatures exist for different types of numerical processing.
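The kind of check behind this claim can be sketched as follows: build a confusion matrix from held-out predictions and compare overall accuracy to the chance level of a balanced n-class problem. This is an illustrative sketch using scikit-learn; the variable names are placeholders.

```python
# Confusion matrix and chance-level comparison for held-out predictions.
import numpy as np
from sklearn.metrics import confusion_matrix, accuracy_score

def summarize(y_true: np.ndarray, y_pred: np.ndarray) -> None:
    classes = np.unique(y_true)
    cm = confusion_matrix(y_true, y_pred, labels=classes)
    acc = accuracy_score(y_true, y_pred)
    chance = 1.0 / len(classes)  # assumes roughly balanced classes
    print("Confusion matrix (rows = true, cols = predicted):")
    print(cm)
    print(f"Accuracy: {acc:.3f}  (chance ~ {chance:.3f})")
```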
Technical
We have a unified training pipeline with:
- Leave-One-Subject-Out cross-validation for robust evaluation
- Automated hyperparameter optimization using Optuna
- Standardized reporting and visualization across all model types
- GPU acceleration (CUDA)
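A condensed sketch of how the first two pieces can fit together, with Leave-One-Subject-Out cross-validation wrapped inside an Optuna objective, is shown below; the arrays X, y, and subjects and the train_and_score() helper are hypothetical stand-ins for the actual data loading and training code, and the search space is illustrative.

```python
# Leave-One-Subject-Out evaluation inside an Optuna objective (sketch).
import numpy as np
import optuna
from sklearn.model_selection import LeaveOneGroupOut

def make_objective(X, y, subjects, train_and_score):
    def objective(trial: optuna.Trial) -> float:
        # Hypothetical hyperparameters drawn from Optuna's search space
        lr = trial.suggest_float("lr", 1e-5, 1e-2, log=True)
        dropout = trial.suggest_float("dropout", 0.1, 0.6)

        scores = []
        for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
            # Each fold holds out every trial from exactly one subject
            scores.append(train_and_score(X[train_idx], y[train_idx],
                                          X[test_idx], y[test_idx],
                                          lr=lr, dropout=dropout))
        return float(np.mean(scores))  # mean held-out-subject accuracy
    return objective

# Example usage (X, y, subjects, train_and_score supplied by the pipeline):
# study = optuna.create_study(direction="maximize")
# study.optimize(make_objective(X, y, subjects, train_and_score), n_trials=50)
```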
Future
We will add "tasks" for ratio, odd/even parity, and more. The framework is extensible, so new tasks, models, and analysis methods can be added.
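As an illustration of that extensibility, the following hypothetical sketch shows a minimal task registry in which a new task only needs a name and a labeling function; the registry and the parity example are illustrative, not the project's actual code.

```python
# Hypothetical task registry: adding a task means registering a labeling function.
from typing import Callable, Dict

TASKS: Dict[str, Callable[[int, int], int]] = {}

def register_task(name: str):
    def wrapper(label_fn: Callable[[int, int], int]):
        TASKS[name] = label_fn
        return label_fn
    return wrapper

@register_task("parity")
def parity_label(prime: int, stimulus: int) -> int:
    # Odd/even parity of the stimulus number of dots
    return stimulus % 2

@register_task("direction")
def direction_label(prime: int, stimulus: int) -> int:
    # 1 if the stimulus is higher than the prime, 0 otherwise
    return int(stimulus > prime)
```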