Multi-Modal Deep Feature Integration for Alzheimer’s Disease Staging


Published in the IEEE International Conference on Bioinformatics and Biomedicine (BIBM), 2023

Alzheimer’s disease (AD) is one of the leading causes of dementia and the seventh leading cause of death in the United States. The provisional diagnosis of AD relies on a comprehensive evaluation, including medical history, neurological and psychiatric examinations, cognitive assessments, and neuroimaging studies. Integrating diverse sets of clinical data, including electronic health records (EHRs), medical imaging, and genomic data, enables a holistic view of AD staging. In this study, we propose an end-to-end deep learning architecture that jointly learns from magnetic resonance imaging (MRI), positron emission tomography (PET), EHRs, and genomics data to classify patients into AD, mild cognitive impairment, and controls. We conduct extensive experiments exploring different feature-level and intermediate-level fusion methods. Our findings suggest that intermediate multiplicative fusion achieves the best stage prediction performance on the external validation dataset. Integrative approaches that leverage all four modalities outperform baselines that rely on only one or two modalities. In an age-wise comparison, we observe that all fusion methods perform better in the earlier age brackets (50-70 years), with performance diminishing in the older brackets (70-90 years). The proposed integration framework has the potential to augment our understanding of disease diagnosis and progression by leveraging complementary information from multimodal patient data.
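To illustrate the idea of intermediate multiplicative fusion described above, here is a minimal, hypothetical PyTorch sketch: each modality is passed through its own encoder, the resulting embeddings are fused with an element-wise (Hadamard) product, and a shared head predicts the three stages. The encoder shapes, layer widths, and feature dimensions are illustrative assumptions, not the architecture reported in the paper.

```python
# Hypothetical sketch of intermediate multiplicative fusion; dimensions are illustrative.
import torch
import torch.nn as nn


class MultiplicativeFusionClassifier(nn.Module):
    """Fuses four modality embeddings via an element-wise (Hadamard) product
    at an intermediate layer, then classifies into AD / MCI / control."""

    def __init__(self, mri_dim=512, pet_dim=512, ehr_dim=64, gen_dim=128,
                 hidden_dim=256, num_classes=3):
        super().__init__()
        # Modality-specific encoders map each input into a shared hidden space.
        self.mri_enc = nn.Sequential(nn.Linear(mri_dim, hidden_dim), nn.ReLU())
        self.pet_enc = nn.Sequential(nn.Linear(pet_dim, hidden_dim), nn.ReLU())
        self.ehr_enc = nn.Sequential(nn.Linear(ehr_dim, hidden_dim), nn.ReLU())
        self.gen_enc = nn.Sequential(nn.Linear(gen_dim, hidden_dim), nn.ReLU())
        self.classifier = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, mri, pet, ehr, gen):
        # Intermediate multiplicative fusion: element-wise product of the embeddings.
        fused = (self.mri_enc(mri) * self.pet_enc(pet)
                 * self.ehr_enc(ehr) * self.gen_enc(gen))
        return self.classifier(fused)  # logits over {AD, MCI, control}


# Usage with random tensors standing in for pre-extracted modality features.
model = MultiplicativeFusionClassifier()
logits = model(torch.randn(8, 512), torch.randn(8, 512),
               torch.randn(8, 64), torch.randn(8, 128))
print(logits.shape)  # torch.Size([8, 3])
```

A feature-level (early) fusion variant would instead concatenate the raw modality features before a single encoder; the multiplicative variant shown here fuses at an intermediate layer, which is the setting the abstract reports as performing best.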


Recommended citation: