Multi-Modal Non-Euclidean Brain Network Analysis With Community Detection and Convolutional Autoencoder


Abstract:

Brain network analysis is one of the most effective methods for brain disease diagnosis. Existing studies have shown that exploiting information from multimodal data is a valuable way to improve the effectiveness of brain network analysis. In recent years, deep learning has received increasing attention due to its powerful feature learning capabilities, and it is natural to introduce this tool into multi-modal brain network analysis. However, doing so faces two challenges. One is that brain networks lie in a non-Euclidean domain, so the convolution kernels used in deep learning cannot be applied to them directly. The other is that most existing multi-modal brain network analysis methods cannot make full use of the complementary information from distinct modalities. In this paper, we propose a multi-modal non-Euclidean brain network analysis method based on community detection and a convolutional autoencoder (M2CDCA), which solves the above two problems simultaneously in one framework. First, we construct the functional and structural brain networks, respectively. Second, we design a multi-modal interactive community detection method that exploits the structural modality to guide the functional modality in detecting community structure, and then readjusts the node distribution so that the adjusted brain network preserves the latent community information and is better suited to convolution kernels. Finally, we design a dual-channel autoencoder model with a self-attention mechanism to capture hierarchical and highly non-linear features, and then comprehensively use the information from both modalities for classification. We evaluate our method on an epilepsy dataset; the experimental results show that our method outperforms several state-of-the-art methods.
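
As a concrete illustration of the node-reordering idea only (not the authors' algorithm), the following Python sketch groups nodes by community detected on a synthetic structural network and applies the same permutation to the functional network, so that the reordered adjacency matrices expose community blocks that local convolution kernels can exploit. Greedy modularity from NetworkX is used here purely as a stand-in for the multi-modal interactive community detection described above, and the matrices, node count, and names (A_struct, A_func) are hypothetical placeholders.

```python
# Illustrative sketch only: community-guided node reordering of brain-network
# adjacency matrices so that 2D convolution kernels see block-structured input.
# Greedy modularity is a stand-in for the paper's multi-modal interactive
# community detection; the connectivity matrices below are synthetic.
import numpy as np
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(0)
n_nodes = 90                                    # e.g., atlas regions (assumption)
A_struct = rng.random((n_nodes, n_nodes))       # structural connectivity (placeholder)
A_struct = (A_struct + A_struct.T) / 2
A_func = rng.random((n_nodes, n_nodes))         # functional connectivity (placeholder)
A_func = (A_func + A_func.T) / 2

# Detect communities on the structural network (the guiding modality).
G_struct = nx.from_numpy_array(A_struct)
communities = greedy_modularity_communities(G_struct, weight="weight")

# Build a permutation that places nodes of the same community next to each other.
order = [node for comm in communities for node in sorted(comm)]
perm = np.array(order)

# Apply the same permutation to both modalities so rows/columns stay aligned;
# the reordered matrices can then be stacked as a two-channel image-like input.
A_struct_reordered = A_struct[np.ix_(perm, perm)]
A_func_reordered = A_func[np.ix_(perm, perm)]

print(A_func_reordered.shape)  # (90, 90)
```

In this sketch, reordering both matrices with a single permutation keeps node correspondence across modalities, which is what allows a dual-channel model to consume them jointly.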