2024-01-23-rosenman24a.md

File metadata and controls

55 lines (55 loc) · 1.99 KB
---
title: "Pre-Training Transformers for Fingerprinting to Improve Stress Prediction in fMRI"
abstract: "We harness a Transformer-based model and a pre-training procedure for fingerprinting on fMRI data, to enhance the accuracy of stress predictions. Our model, called MetricFMRI, first optimizes a pixel-based reconstruction loss. In a second unsupervised training phase, a triplet loss is used to encourage fMRI sequences of the same subject to have closer representations, while sequences from different subjects are pushed away from each other. Finally, supervised learning is used for the target task, based on the learned representation. We evaluate the performance of our model and other alternatives and conclude that the triplet training for the fingerprinting task is key to the improved accuracy of our method for the task of stress prediction. To obtain insights regarding the learned model, gradient-based explainability techniques are used, indicating that sub-cortical brain regions that are known to play a central role in stress-related processes are highlighted by the model."
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: rosenman24a
month: 0
tex_title: "Pre-Training Transformers for Fingerprinting to Improve Stress Prediction in fMRI"
firstpage: 212
lastpage: 234
page: 212-234
order: 212
cycles: false
bibtex_author: Rosenman, Gony and Malkiel, Itzik and Greental, Ayam and Hendler, Talma and Wolf, Lior
author:
- given: Gony
  family: Rosenman
- given: Itzik
  family: Malkiel
- given: Ayam
  family: Greental
- given: Talma
  family: Hendler
- given: Lior
  family: Wolf
date: 2024-01-23
address:
container-title: Medical Imaging with Deep Learning
volume: '227'
genre: inproceedings
issued:
  date-parts:
  - 2024
  - 1
  - 23
pdf:
extras:
---
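
The fingerprinting phase described in the abstract can be illustrated with a minimal sketch, assuming a PyTorch setup: a Transformer encoder maps each fMRI sequence to an embedding, and a triplet loss pulls embeddings of the same subject together while pushing embeddings of different subjects apart. The `SequenceEncoder` class, its dimensions, and the random tensors standing in for fMRI sequences are illustrative assumptions, not the authors' MetricFMRI implementation.

```python
# Minimal sketch (not the authors' code) of the triplet-loss fingerprinting
# phase: sequences from the same subject are encouraged to have closer
# representations than sequences from different subjects.
import torch
import torch.nn as nn

class SequenceEncoder(nn.Module):
    """Toy Transformer encoder mapping an fMRI sequence to one embedding."""
    def __init__(self, input_dim=1024, d_model=256, nhead=8, num_layers=4):
        super().__init__()
        self.proj = nn.Linear(input_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)

    def forward(self, x):                  # x: (batch, time, input_dim)
        h = self.encoder(self.proj(x))     # (batch, time, d_model)
        return h.mean(dim=1)               # pooled sequence embedding

encoder = SequenceEncoder()
triplet_loss = nn.TripletMarginLoss(margin=1.0, p=2)
optimizer = torch.optim.Adam(encoder.parameters(), lr=1e-4)

# anchor/positive: two sequences from the same subject; negative: another subject.
# Random tensors stand in for preprocessed fMRI sequences here.
anchor   = torch.randn(8, 20, 1024)
positive = torch.randn(8, 20, 1024)
negative = torch.randn(8, 20, 1024)

optimizer.zero_grad()
loss = triplet_loss(encoder(anchor), encoder(positive), encoder(negative))
loss.backward()
optimizer.step()
```

After this unsupervised phase, the learned encoder would be fine-tuned with supervised learning on the stress-prediction target, as outlined in the abstract.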