Anirudh Bindiganavale Harish*, Peikun Guo*, Bhargav Ghanekar**, Diya Gupta**, Akilesh Rajavenkatanarayanan, Manoj Kumar Sharma, Maureen August, Akane Sano, Ashok Veeraraghavan
[Paper] | [Dataset] | [Project Page] | [Supplementary]
Official repository for the NeurIPS 2025 Datasets and Benchmarks Track paper CogPhys, a comprehensive multimodal dataset for assessing cognitive load through physiological measurements. The dataset combines remote (non-contact) and contact-based sensing modalities to enable robust cognitive load estimation under varied conditions.
Key Features:
- Dataset Size: 37 participants performing 6 tasks for 2 minutes each, for a total of 220 recordings (two trials were corrupted)
- Multiple Modalities: RGB, NIR, Thermal (above/below), Radar, and contact-based sensors
- Dual Tasks: Remote photoplethysmography (rPPG) for heart rate and respiration monitoring
- Cognitive Load Assessment: Physiological signals combined with cognitive task performance
- Built on rPPG-Toolbox: Compatible with the rPPG-Toolbox framework
🔥 Please star ⭐ this repo if you find it useful and cite our work if you use it in your research! 🔥
This dataset is for academic use only. Commercial usage is prohibited.
This dataset requires a signed Data Use Agreement. Please contact Anirudh (anirudhbh@rice.edu) for more information. You may also contact Ashok Veeraraghavan (vashok@rice.edu).
News:
- [2025/10] Initial code release
Requirements:
- Python 3.8+
- PyTorch 1.9+
- CUDA (for GPU support)
```bash
git clone https://github.com/AnirudhBHarish/CogPhys.git
cd CogPhys
pip install -r requirements.txt
```

The CogPhys dataset can be accessed by [filling this form / contacting us at email].
This dataset (N=37) is organized as follows:
```
participant_XX/
├── task_YY/
│   ├── NIR/
│   ├── RGBD/
│   ├── PPG/
│   ├── Thermal_above/
│   ├── Thermal_below/
│   ├── Radar/
│   ├── Chest Band
│   ├── ECG
│   ├── Respiration
│   └── Accelerometer
```
A metadata.csv file with demographic information and a CSV file with the cognitive load labels are also provided in the root directory.
The following files are not viable for unimodal analysis. The dataloader automatically drops them based on the input modality, so you do not need to remove them manually.
- RGB: v23_read
- NIR: v23_read, v19_still
- Respiration (includes thermal and radar): v9_still, v7_still, v5_still, v31_still, v30_still, v15_still, v12_still, v11_still, v10_still
- Radar: v26_read_rest, v31_still
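The exclusion lists above can be expressed as a small lookup table. The following is a hypothetical sketch of the filtering the dataloader performs automatically; the modality keys and the function name are illustrative, not the repo's actual API.

```python
# Hypothetical sketch of the modality-specific exclusions listed above.
# The repo's dataloader applies these automatically; the dictionary keys
# here are assumed names, not the toolbox's actual modality identifiers.
EXCLUDED = {
    "rgb": {"v23_read"},
    "nir": {"v23_read", "v19_still"},
    "respiration": {"v9_still", "v7_still", "v5_still", "v31_still", "v30_still",
                    "v15_still", "v12_still", "v11_still", "v10_still"},
    "radar": {"v26_read_rest", "v31_still"},
}

def filter_recordings(recordings, modality):
    """Drop recordings that are not viable for the given modality."""
    bad = EXCLUDED.get(modality, set())
    return [r for r in recordings if r not in bad]
```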
- During training, we recommend training the thermal and radar models with just the `still` and `rest` samples. Training is unstable with motion samples.
We provide checkpoints for the models we trained. Please check final_model_release/CogPhys.
Quick Start:

```bash
# Train
python main.py --config_file ./configs/train_configs/CogPhys_CONTRASTPHYS_BASIC.yaml
```

To test, open and run the test_rppg.ipynb notebook, specifying the config file, model checkpoint, and output directory.

Folds:
- dataset/CogPhysFolds/CogPhys_all_Folds.pkl: contains 4 folds. Each of the 37 participants appears exactly once in a test set; pooling the test sets gives you all 37 participants.
- dataset/CogPhysFolds/CogPhys_data_gen_fold.pkl: contains 1 fold with no train or validation set. All 37 participants are in the test set, which is useful when generating waveforms.
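As a sanity check, the pooled test sets of CogPhys_all_Folds.pkl should cover all 37 participants. A minimal sketch, assuming the pickle holds an iterable of folds with a "test" key (the actual structure may differ):

```python
import pickle

# Hedged sketch: the internal structure of the fold pickle is an
# assumption (an iterable of folds with a "test" key); the README only
# states that the 4 test sets together cover all 37 participants
# exactly once.
def pooled_test_participants(pkl_path):
    """Return all participant IDs pooled across the folds' test sets."""
    with open(pkl_path, "rb") as f:
        folds = pickle.load(f)
    test_ids = []
    for fold in folds:
        test_ids.extend(fold["test"])
    return test_ids
```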
rPPG Tasks

1. RGB

Setup:
- Modify configs/train_configs/CogPhys_CONTRASTPHYS_BASIC.yaml to set the model save path
- Ensure the input key is ['rgb_left']

Run:
```bash
python main.py --config_file ./configs/train_configs/CogPhys_CONTRASTPHYS_BASIC.yaml
```

2. NIR

Setup:
- Modify configs/train_configs/CogPhys_CONTRASTPHYS_BASIC.yaml to set the model save path
- Ensure the input key is ['nir']

Run:
```bash
python main.py --config_file ./configs/train_configs/CogPhys_CONTRASTPHYS_BASIC.yaml
```

3. RGB + NIR Fusion

Setup:
- Modify configs/train_configs/CogPhys_Fusion_BASIC.yaml to set the model save path
- Ensure the input key is ['rgb_left', 'nir']

Run:
```bash
python main.py --config_file ./configs/train_configs/CogPhys_Fusion_BASIC.yaml
```

Respiration Tasks

1. Thermal (Above)

Setup:
- Modify configs/train_configs/CogPhys_Resp_CONTRASTPHYS_BASIC.yaml to set the model save path
- Ensure the input key is ['thermal above']

Run:
```bash
python main.py --config_file ./configs/train_configs/CogPhys_Resp_CONTRASTPHYS_BASIC.yaml
```

2. Thermal (Below)

Setup:
- Modify configs/train_configs/CogPhys_Resp_CONTRASTPHYS_BASIC.yaml to set the model save path
- Ensure the input key is ['thermal below']

Run:
```bash
python main.py --config_file ./configs/train_configs/CogPhys_Resp_CONTRASTPHYS_BASIC.yaml
```

3. Radar

Setup:
- Modify configs/train_configs/CogPhys_Resp_Radar_BASIC.yaml to set the model save path
- Ensure the input key is ['radar']

Run:
```bash
python main.py --config_file ./configs/train_configs/CogPhys_Resp_Radar_BASIC.yaml
```

4. Thermal Fusion

Setup:
- Modify configs/train_configs/CogPhys_Resp_Fusion_BASIC.yaml to set the model save path
- Ensure the input key is ['thermal_below', 'thermal_above']

Run:
```bash
python main.py --config_file ./configs/train_configs/CogPhys_Resp_Fusion_BASIC.yaml
```

5. Waveform Fusion

Step 1: Generate Waveforms
- Run the test_resp.ipynb notebook to save the waveforms
- In place of the regular pickle file, use CogPhys_data_gen_fold.pkl (it places all samples in the test set)
- Run chunk_waveforms.ipynb to chunk the saved waveforms
Step 2: Train
- Modify configs/train_configs/CogPhys_Resp_Waveform_BASIC.yaml to set the model save path
- Ensure the input key is ['thermal_waveform', 'radar_waveform']

Run:
```bash
python main.py --config_file ./configs/train_configs/CogPhys_Resp_Waveform_BASIC.yaml
```

Note: Replace CONTRASTPHYS with any other rPPG-Toolbox supported model, such as:
- PHYSMAMBA
- DEEPHYS
- PHYSNET
- PHYSFORMER
- RHYTHMFORMER
- and more...
To use a different model, change the model name in the config file name and adjust the model parameters accordingly.
rPPG Testing:
- Open the test_rppg.ipynb notebook
- Specify the following in the notebook:
  - Config file of the pretrained model
  - Final saved model checkpoint path
  - Directory to save generated waveforms
- Run the rest of the notebook as-is to generate results
Respiration Testing (Thermal):
- Open the test_resp.ipynb notebook
- Specify the following in the notebook:
  - Config file of the pretrained model
  - Final saved model checkpoint path
  - Directory to save generated waveforms
- Run the rest of the notebook as-is to generate results
Respiration Testing (Radar):
- Open the test_resp_radar.ipynb notebook
- Specify the following in the notebook:
  - Final saved model checkpoint path
  - Directory to save generated waveforms
- Run the rest of the notebook as-is to generate results
Preparing signals for cognitive load estimation is similar to Step 1 of Waveform Fusion training (point 5 in Respiration Tasks):
- Run the rPPG notebooks (test_rppg.ipynb) to save waveforms.
- Run the respiration notebooks (test_resp.ipynb) to save waveforms.
- Run the pool_signals.ipynb notebook to pool the generated waveforms and save the pickle files required for cognitive load estimation.
Note: pool_signals.ipynb takes a list of waveform files (the list can also have length 1). If you are working on a single fold (e.g., fold 0), use CogPhys_data_gen_fold.pkl with the test notebooks to generate waveforms for all samples, then run pool_signals.ipynb with the single waveform file. Alternatively, if you are performing 4-fold validation, generate separate waveforms for the test set of each fold, then run pool_signals.ipynb with the list of all waveform files to generate the pickle files needed to run cognitive load estimation.
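Conceptually, pooling amounts to merging the per-fold waveform files into one collection. A minimal sketch under the assumption that each file stores a dict keyed by sample ID; the actual logic lives in pool_signals.ipynb.

```python
import pickle

# Hedged sketch of what pooling per-fold waveform files amounts to; the
# assumption that each file holds a dict keyed by sample ID is ours,
# not the repo's documented format.
def pool_waveforms(paths):
    """Merge a list of pickled waveform files into a single dict."""
    pooled = {}
    for path in paths:
        with open(path, "rb") as f:
            pooled.update(pickle.load(f))
    return pooled
```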
All code and instructions for cognitive load estimation are in the cognitive_load/ folder.
Please refer to cognitive_load/README.md for detailed instructions.
We provide 4 algorithmic baselines in the algorithmic_baselines/ folder. These are traditional unsupervised methods for rPPG estimation.
Implementation Details:
- Base functions are taken from the rPPG-Toolbox repository under unsupervised_methods/
- All baselines are adapted for the CogPhys dataset
- Methods include traditional signal processing approaches (e.g., GREEN, ICA, CHROM, POS, etc.)
Usage: Please refer to the code and README in algorithmic_baselines/ for implementation details and usage instructions.
The rppg_bias_analysis.ipynb notebook can be run to obtain the bias numbers.
Required Inputs:
- Path to metadata.csv
- Folder containing the generated vitals (from the test script)
Steps:
- Open the rppg_bias_analysis.ipynb notebook
- Update the two paths mentioned above
- Run the notebook to obtain the bias metrics
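At its core, a per-group bias metric averages errors within each demographic group from metadata.csv. A hedged sketch with hypothetical input shapes (participant-keyed dicts) and an attribute name that is not necessarily the repo's actual schema:

```python
from collections import defaultdict

# Hedged sketch of a per-group bias summary, not the notebook's actual
# code; the participant-keyed dict inputs and the attribute name used
# below are assumptions about the metadata schema.
def bias_by_group(metadata, errors, attr):
    """Mean absolute error per demographic group.

    metadata: {participant_id: {attr: value, ...}}
    errors:   {participant_id: MAE}
    """
    sums, counts = defaultdict(float), defaultdict(int)
    for pid, mae in errors.items():
        group = metadata[pid][attr]
        sums[group] += mae
        counts[group] += 1
    return {g: sums[g] / counts[g] for g in sums}
```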
If you use CogPhys in your research, please cite:
@inproceedings{
harish2025cogphys,
title={CogPhys: Assessing Cognitive Load via Multimodal Remote and Contact-based Physiological Sensing},
author={Anirudh Bindiganavale Harish and Peikun Guo and Diya Gupta and Bhargav Ghanekar and Akilesh Rajavenkatanarayan and MANOJ KUMAR SHARMA and Maureen Elizabeth August and Akane Sano and Ashok Veeraraghavan},
booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
year={2025},
url={https://openreview.net/forum?id=VJEcCMx16R}
}

This work builds upon the rPPG-Toolbox. We thank the authors for their excellent framework.